Apr 16 20:57:52.106733 ip-10-0-141-171 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:57:52.500879 ip-10-0-141-171 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:57:52.500879 ip-10-0-141-171 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:57:52.500879 ip-10-0-141-171 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:57:52.500879 ip-10-0-141-171 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:57:52.500879 ip-10-0-141-171 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:57:52.501605 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.501513 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:57:52.508503 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508473 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:57:52.508503 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508495 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:57:52.508503 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508502 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:57:52.508503 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508506 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:57:52.508503 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508509 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:57:52.508503 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508512 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508517 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508520 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508523 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508526 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508529 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508531 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508534 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508537 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508540 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508545 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508548 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508551 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508553 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508556 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508560 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508563 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508565 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508570 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508574 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:57:52.508741 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508577 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508580 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508583 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508588 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508591 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508594 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508597 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508600 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508604 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508608 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508611 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508615 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508617 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508620 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508624 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508630 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508633 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508636 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508640 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508643 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:57:52.509243 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508645 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508648 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508651 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508654 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508656 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508659 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508662 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508667 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508669 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508672 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508675 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508677 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508680 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508682 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508685 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508688 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508691 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508693 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508696 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508701 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:57:52.509770 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508703 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508706 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508712 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508714 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508717 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508720 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508724 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508727 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508731 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508734 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508737 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508742 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508745 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508748 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508757 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508760 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508763 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508766 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508769 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:57:52.510253 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508772 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.508774 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509392 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509397 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509399 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509402 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509405 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509407 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509411 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509413 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509416 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509419 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509424 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509427 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509429 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509433 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509451 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509455 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509459 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509464 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:57:52.510736 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509468 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509472 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509475 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509477 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509480 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509485 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509488 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509491 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509494 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509496 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509499 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509502 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509505 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509508 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509512 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509514 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509520 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509522 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509527 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:57:52.511245 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509530 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509533 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509536 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509539 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509542 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509545 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509548 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509551 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509553 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509557 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509562 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509565 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509569 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509572 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509574 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509577 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509580 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509582 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509585 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509588 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:57:52.511779 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509590 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509593 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509598 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509600 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509605 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509607 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509610 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509613 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509615 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509618 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509620 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509623 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509626 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509629 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509634 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509636 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509639 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509641 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509645 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:57:52.512259 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509649 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509652 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509654 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509657 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509660 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509663 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509666 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509668 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509673 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.509676 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511044 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511069 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511078 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511083 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511088 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511091 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511096 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511101 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511104 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511107 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511110 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511114 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:57:52.512752 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511117 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511120 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511123 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511126 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511129 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511132 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511135 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511141 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511144 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511147 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511150 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511153 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511157 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511160 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511164 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511167 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511171 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511174 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511177 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511181 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511184 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511189 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511192 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511195 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511198 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:57:52.513293 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511202 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511204 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511210 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511213 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511216 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511219 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511222 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511226 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511229 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511233 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511236 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511239 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511242 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511244 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511248 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511251 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511255 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511258 2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511262 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511265 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511268 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511271 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511274 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511277 2575 flags.go:64] FLAG: --help="false"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511280 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.513915 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511283 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511287 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511290 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511294 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511299 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511301 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511304 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511307 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511311 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511314 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511317 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511320 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511323 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511326 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511329 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511332 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511335 2575 flags.go:64] FLAG: --lock-file=""
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511338 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511341 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511344 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511349 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511353 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511356 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511359 2575 flags.go:64] FLAG: --logging-format="text"
Apr 16 20:57:52.514525 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511362 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511366 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511369 2575 flags.go:64] FLAG: --manifest-url=""
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511372 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511382 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511385 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511390 2575 flags.go:64] FLAG: --max-pods="110"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511393 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511396 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511400 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511403 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511406 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511408 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511412 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511421 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511424 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511427 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511430 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511433 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511454 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511457 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511461 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511464 2575 flags.go:64] FLAG: --port="10250"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511467 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 20:57:52.515184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511470 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-065a0fbf16f6f9567"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511473 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511476 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511479 2575 flags.go:64] FLAG: --register-node="true"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511482 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511485 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511489 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511492 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511494 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511497 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511501 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511505 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511508 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511511 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511513 2575 flags.go:64] FLAG: --runonce="false"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511516 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511520 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511523 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511527 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511530 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511533 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511537 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511540 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511543 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511545 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511549 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 20:57:52.515783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511552 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511555 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511558 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511561 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511568 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511570 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511573 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511578 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511581 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511584 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511587 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511590 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511593 2575 flags.go:64] FLAG: --v="2"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511597 2575 flags.go:64] FLAG: --version="false"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511608 2575 flags.go:64] FLAG: --vmodule=""
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511613 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.511616 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511725 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511729 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511733 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511735 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511739 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511742 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511745 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:57:52.516426 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511748 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511751 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511754 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511757 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511759 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511762 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511765 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511767 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511770 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511773 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511776 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511778 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511781 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511784 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511787 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511789 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511792 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511795 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511798 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:57:52.517033 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511801 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511803 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511806 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511808 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511812 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511816 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511819 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511822 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511824 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511827 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511830 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511832 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511834 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511838 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511842 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511845 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511848 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511851 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511853 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:57:52.517530 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511856 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511858 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511861 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511863 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511866 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511869 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511872 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511875 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511877 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511879 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511882 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511885 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511887 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511890 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511892 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511895 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511897 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511900 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511902 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511905 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:57:52.518043 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511907 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511910 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511913 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511915 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511918 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511920 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511923 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511926 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511928 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511931 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511933 2575 feature_gate.go:328] unrecognized feature gate:
ImageStreamImportMode Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511935 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511938 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511940 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511943 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511946 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511948 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511951 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511953 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511956 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:57:52.518555 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.511958 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.512537 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.518953 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.518970 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519019 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519024 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519027 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519030 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519034 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519037 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519040 2575 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519042 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519045 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519048 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519051 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:57:52.519046 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519053 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519056 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519059 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519062 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519065 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519067 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519070 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519072 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519075 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519078 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519081 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519083 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519086 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519088 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519091 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519094 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519096 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519099 2575 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519101 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519104 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:57:52.519421 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519107 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519110 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519112 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519115 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519117 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519120 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519122 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519125 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519127 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519130 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519133 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519138 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519141 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519143 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519147 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519150 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519153 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519156 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519159 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519161 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:57:52.519944 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519164 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519166 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519169 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519171 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519174 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519176 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519179 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519181 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519184 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519187 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519190 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519192 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519195 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519198 2575 feature_gate.go:328] unrecognized feature 
gate: AdditionalRoutingCapabilities Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519201 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519203 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519206 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519208 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519211 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519213 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:57:52.520432 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519216 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519218 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519221 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519223 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519225 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519228 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519232 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519234 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519237 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519239 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519242 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519245 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519248 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519251 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519255 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.519261 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:57:52.520952 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519359 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519364 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519367 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519369 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519372 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519375 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519378 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519380 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519383 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519385 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519388 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519391 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519395 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519399 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519402 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519405 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519408 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519411 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519413 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:57:52.521351 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519416 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519419 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519421 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519424 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519427 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519431 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519449 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519454 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519458 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519462 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519464 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519467 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519469 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519472 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519474 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519477 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 
20:57:52.519480 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519482 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519485 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519487 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:57:52.521845 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519490 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519492 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519496 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519500 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519508 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519511 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519514 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519516 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519519 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519522 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519524 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519527 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519529 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519532 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519534 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519537 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519540 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519542 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519545 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519547 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:57:52.522331 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519550 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519552 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519555 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519557 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519560 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519563 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519566 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519568 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519570 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519573 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519575 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519578 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519580 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519583 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519585 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519588 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519591 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519594 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519597 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519606 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:57:52.522823 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519610 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:57:52.523413 ip-10-0-141-171 
kubenswrapper[2575]: W0416 20:57:52.519612 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519615 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519618 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519620 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519623 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:52.519626 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.519631 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.520322 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:57:52.523413 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.522903 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:57:52.523678 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.523643 2575 server.go:1019] "Starting client certificate rotation" Apr 16 20:57:52.523765 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.523747 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:57:52.523823 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.523804 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:57:52.545802 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.545771 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:57:52.548174 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.548143 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:57:52.559146 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.559129 2575 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:57:52.564023 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.564005 2575 log.go:25] "Validated CRI v1 image API" Apr 16 20:57:52.565232 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.565208 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:57:52.569638 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.569616 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 caef7d08-f88a-4f45-bec1-bf00ed614fb7:/dev/nvme0n1p3 cb2d7eff-908a-436e-895b-03fa42334201:/dev/nvme0n1p4] Apr 16 20:57:52.569703 ip-10-0-141-171 kubenswrapper[2575]: 
I0416 20:57:52.569638 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:57:52.576789 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.576672 2575 manager.go:217] Machine: {Timestamp:2026-04-16 20:57:52.57499752 +0000 UTC m=+0.364614517 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096228 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28358623e02ac239a7ec313bfdec8d SystemUUID:ec283586-23e0-2ac2-39a7-ec313bfdec8d BootID:f62e42e3-16e6-45c8-a1f6-cf99c9194a25 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:db:46:f4:b2:9d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:db:46:f4:b2:9d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:1b:a6:60:4b:03 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:57:52.576789 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.576776 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 20:57:52.576927 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.576906 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 20:57:52.579359 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.579333 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 20:57:52.579526 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.579363 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-171.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 20:57:52.579577 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.579536 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 20:57:52.579577 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.579545 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 20:57:52.579577 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.579558 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:57:52.580198 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.580186 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:57:52.581653 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.581642 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:57:52.581776 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.581767 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:57:52.582089 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.582073 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:57:52.584001 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.583989 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:57:52.584089 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.584007 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:57:52.584089 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.584021 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:57:52.584089 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.584030 2575 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:57:52.584089 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.584040 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:57:52.585149 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.585135 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:57:52.585203 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.585161 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:57:52.587853 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.587838 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:57:52.589483 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.589470 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:57:52.590510 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590499 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:57:52.590548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590516 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:57:52.590548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590522 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:57:52.590548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590528 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:57:52.590548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590534 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:57:52.590548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590539 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:57:52.590548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590545 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:57:52.590704 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590551 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:57:52.590704 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590560 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:57:52.590704 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590566 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:57:52.590704 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590580 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:57:52.590704 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.590589 2575 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:57:52.591216 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.591203 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:57:52.591275 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.591219 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:57:52.594811 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.594795 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:57:52.594876 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.594839 2575 server.go:1295] "Started kubelet" Apr 16 20:57:52.595010 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.594931 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:57:52.595071 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.594969 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:57:52.595071 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.595067 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:57:52.595923 ip-10-0-141-171 systemd[1]: Started Kubernetes Kubelet. Apr 16 20:57:52.596755 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.596523 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:57:52.597380 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.597361 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:57:52.597488 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.597425 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-171.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:57:52.597661 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.597639 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:57:52.597661 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.597633 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-171.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:57:52.602653 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.602627 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:57:52.602862 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.602850 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:57:52.602862 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.602858 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:57:52.603465 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603421 2575 factory.go:55] Registering systemd factory Apr 16 20:57:52.603465 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603459 2575 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:57:52.603465 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603462 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:57:52.603723 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603463 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:57:52.603723 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603595 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:57:52.603723 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603661 2575 factory.go:153] Registering CRI-O factory Apr 16 20:57:52.603723 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603677 2575 factory.go:223] Registration of the crio container factory successfully Apr 16 20:57:52.603723 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.603712 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 20:57:52.603723 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603730 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:57:52.604017 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603739 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:57:52.604017 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603768 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:57:52.604017 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603800 2575 factory.go:103] Registering Raw factory Apr 16 20:57:52.604017 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.603813 2575 manager.go:1196] Started watching for new ooms in manager Apr 16 20:57:52.604212 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.604114 2575 manager.go:319] Starting recovery of all containers Apr 16 20:57:52.607684 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.604775 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-171.ec2.internal.18a6f1e5dc953432 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-171.ec2.internal,UID:ip-10-0-141-171.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-171.ec2.internal,},FirstTimestamp:2026-04-16 20:57:52.594809906 +0000 UTC m=+0.384426904,LastTimestamp:2026-04-16 20:57:52.594809906 +0000 UTC 
m=+0.384426904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-171.ec2.internal,}" Apr 16 20:57:52.611102 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.611059 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-171.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 20:57:52.611238 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.611208 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:57:52.620254 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.620233 2575 manager.go:324] Recovery completed Apr 16 20:57:52.624463 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.624450 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:57:52.626772 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.626752 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:57:52.626849 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.626781 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:57:52.626849 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.626792 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:57:52.627275 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.627261 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:57:52.627275 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.627274 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:57:52.627355 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.627292 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:57:52.629137 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.629075 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-171.ec2.internal.18a6f1e5de7cdbba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-171.ec2.internal,UID:ip-10-0-141-171.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-171.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-171.ec2.internal,},FirstTimestamp:2026-04-16 20:57:52.626768826 +0000 UTC m=+0.416385821,LastTimestamp:2026-04-16 20:57:52.626768826 +0000 UTC m=+0.416385821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-171.ec2.internal,}" Apr 16 20:57:52.629286 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.629275 2575 policy_none.go:49] "None policy: Start" 
Apr 16 20:57:52.629316 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.629291 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:57:52.629316 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.629301 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:57:52.634992 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.634968 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-clfcj"
Apr 16 20:57:52.641247 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.641176 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-171.ec2.internal.18a6f1e5de7d20bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-171.ec2.internal,UID:ip-10-0-141-171.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-141-171.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-141-171.ec2.internal,},FirstTimestamp:2026-04-16 20:57:52.626786491 +0000 UTC m=+0.416403486,LastTimestamp:2026-04-16 20:57:52.626786491 +0000 UTC m=+0.416403486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-171.ec2.internal,}"
Apr 16 20:57:52.642739 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.642725 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-clfcj"
Apr 16 20:57:52.675045 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.675028 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.675071 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.675087 2575 server.go:85] "Starting device plugin registration server"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.675361 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.675371 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.675481 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.676140 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.676151 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.676249 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:57:52.685830 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.676301 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-171.ec2.internal\" not found"
Apr 16 20:57:52.712917 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.712895 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:57:52.714107 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.714089 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:57:52.714201 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.714117 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:57:52.714201 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.714134 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:57:52.714201 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.714141 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:57:52.714201 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.714183 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:57:52.716154 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.716137 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:57:52.775738 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.775682 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:57:52.777024 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.777009 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:57:52.777101 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.777036 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:57:52.777101 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.777047 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:57:52.777101 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.777069 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.783054 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.783037 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.783147 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.783064 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-171.ec2.internal\": node \"ip-10-0-141-171.ec2.internal\" not found"
Apr 16 20:57:52.799917 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.799893 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found"
Apr 16 20:57:52.814268 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.814246 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal"]
Apr 16 20:57:52.814338 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.814303 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:57:52.815726 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.815705 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:57:52.815806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.815735 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:57:52.815806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.815745 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:57:52.817819 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.817807 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:57:52.817979 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.817966 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.818020 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.817994 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:57:52.818894 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.818880 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:57:52.818939 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.818901 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:57:52.818939 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.818910 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:57:52.819006 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.818886 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:57:52.819006 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.818970 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:57:52.819006 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.818979 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:57:52.820986 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.820968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.821060 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.820999 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:57:52.822020 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.822004 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:57:52.822104 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.822035 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:57:52.822104 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.822046 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:57:52.844534 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.844512 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-171.ec2.internal\" not found" node="ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.848585 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.848569 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-171.ec2.internal\" not found" node="ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.900784 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:52.900750 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found"
Apr 16 20:57:52.905347 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.905329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/60c9ae910eb32de4a33e8b075589ffa2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal\" (UID: \"60c9ae910eb32de4a33e8b075589ffa2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.905405 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.905356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60c9ae910eb32de4a33e8b075589ffa2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal\" (UID: \"60c9ae910eb32de4a33e8b075589ffa2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:52.905405 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:52.905372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/334c8afc5dc3a1fc4c297cab1fdb85d1-config\") pod \"kube-apiserver-proxy-ip-10-0-141-171.ec2.internal\" (UID: \"334c8afc5dc3a1fc4c297cab1fdb85d1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.001512 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.001485 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found"
Apr 16 20:57:53.005863 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.005835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/60c9ae910eb32de4a33e8b075589ffa2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal\" (UID: \"60c9ae910eb32de4a33e8b075589ffa2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.005945 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.005871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60c9ae910eb32de4a33e8b075589ffa2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal\" (UID: \"60c9ae910eb32de4a33e8b075589ffa2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.005945 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.005890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/334c8afc5dc3a1fc4c297cab1fdb85d1-config\") pod \"kube-apiserver-proxy-ip-10-0-141-171.ec2.internal\" (UID: \"334c8afc5dc3a1fc4c297cab1fdb85d1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.006047 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.005941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/60c9ae910eb32de4a33e8b075589ffa2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal\" (UID: \"60c9ae910eb32de4a33e8b075589ffa2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.006047 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.005999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/334c8afc5dc3a1fc4c297cab1fdb85d1-config\") pod \"kube-apiserver-proxy-ip-10-0-141-171.ec2.internal\" (UID: \"334c8afc5dc3a1fc4c297cab1fdb85d1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.006047 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.006021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60c9ae910eb32de4a33e8b075589ffa2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal\" (UID: \"60c9ae910eb32de4a33e8b075589ffa2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.102283 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.102221 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found"
Apr 16 20:57:53.146712 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.146668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal"
Apr 16 20:57:53.150082 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.150059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal"
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal" Apr 16 20:57:53.202660 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.202628 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 20:57:53.303076 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.303042 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 20:57:53.403545 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.403476 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 20:57:53.503876 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.503841 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 20:57:53.524165 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.524134 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 20:57:53.524292 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.524276 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:57:53.603476 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.603457 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:57:53.603946 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.603926 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 20:57:53.616208 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.616185 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:57:53.618917 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.618891 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:57:53.643986 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.643945 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:52:52 +0000 UTC" deadline="2027-11-07 07:40:38.236343874 +0000 UTC" Apr 16 20:57:53.643986 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.643978 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13666h42m44.592368976s" Apr 16 20:57:53.644400 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.644383 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2rjlf" Apr 16 20:57:53.655049 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.654984 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2rjlf" Apr 16 20:57:53.704815 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:53.704785 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-171.ec2.internal\" not found" Apr 16 
20:57:53.725186 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:53.725153 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c9ae910eb32de4a33e8b075589ffa2.slice/crio-e12e564d4e1cca469bc5751c8ccd0311d7592e9f4d6a3b178f7bcd77cfecc7a6 WatchSource:0}: Error finding container e12e564d4e1cca469bc5751c8ccd0311d7592e9f4d6a3b178f7bcd77cfecc7a6: Status 404 returned error can't find the container with id e12e564d4e1cca469bc5751c8ccd0311d7592e9f4d6a3b178f7bcd77cfecc7a6 Apr 16 20:57:53.725416 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:53.725396 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod334c8afc5dc3a1fc4c297cab1fdb85d1.slice/crio-1a0ed28cc3d5daa9170298609c013ad3e57362e376e41f7981ef41d0f96ab66f WatchSource:0}: Error finding container 1a0ed28cc3d5daa9170298609c013ad3e57362e376e41f7981ef41d0f96ab66f: Status 404 returned error can't find the container with id 1a0ed28cc3d5daa9170298609c013ad3e57362e376e41f7981ef41d0f96ab66f Apr 16 20:57:53.729838 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.729825 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:57:53.790582 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.790549 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:57:53.803795 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.803774 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal" Apr 16 20:57:53.818207 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.818183 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:57:53.819680 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.819666 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal" Apr 16 20:57:53.826538 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.826524 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:57:53.907101 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:53.907023 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:57:54.585344 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.585318 2575 apiserver.go:52] "Watching apiserver" Apr 16 20:57:54.595803 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.595776 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:57:54.596250 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.596225 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
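The SyncLoop ADD record above carries the complete list of API-sourced pods bound to this node in a single pods=[...] field, so it can be sliced straight out of the journal. A sketch tallying it by namespace (kubelet.log as before; the counts in the comments are derived from this log):

    import re
    from collections import Counter

    with open("kubelet.log") as f:   # assumed dump of this journal section
        text = f.read()

    m = re.search(r'"SyncLoop ADD" source="api" pods=\[([^\]]+)\]', text)
    pods = [p.strip('"') for p in m.group(1).split(",")]

    print(len(pods))                               # 13 pods in this ADD batch
    print(Counter(p.split("/")[0] for p in pods))  # openshift-multus: 3, kube-system: 2, rest 1 each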
pods=["openshift-network-operator/iptables-alerter-hhb77","kube-system/konnectivity-agent-q97vd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x","openshift-cluster-node-tuning-operator/tuned-tnqpt","openshift-dns/node-resolver-tzlx6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal","openshift-multus/multus-additional-cni-plugins-8jcqc","openshift-multus/multus-bbf6s","openshift-ovn-kubernetes/ovnkube-node-tzbn8","kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal","openshift-image-registry/node-ca-n7hjd","openshift-multus/network-metrics-daemon-zbj49","openshift-network-diagnostics/network-check-target-nngg9"] Apr 16 20:57:54.599391 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.599363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.602737 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.602420 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-446tz\"" Apr 16 20:57:54.602737 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.602495 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.602737 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.602552 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:57:54.602737 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.602583 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:57:54.603001 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.602752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:57:54.603054 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.603016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.604084 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.603788 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.604084 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.603906 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.606059 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.606025 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.606313 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.606287 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xrk5p\"" Apr 16 20:57:54.606516 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.606499 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.606739 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.606717 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.606739 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.606733 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.607263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.607241 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.607942 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.607796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-48sqq\"" Apr 16 20:57:54.609524 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.608827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.609524 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.608960 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s6qxs\"" Apr 16 20:57:54.609524 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.609254 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:57:54.609524 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.609343 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.612170 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.611991 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.614061 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.613949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.614061 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.613993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysctl-d\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614061 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-var-lib-kubelet\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-host\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92c3499b-a30d-4470-8b98-4e3a3b91de06-tmp-dir\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-systemd\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-run\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614223 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-lib-modules\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614265 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdbq\" (UniqueName: \"kubernetes.io/projected/92c3499b-a30d-4470-8b98-4e3a3b91de06-kube-api-access-rgdbq\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-os-release\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-cni-binary-copy\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614328 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysconfig\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-kubernetes\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-sys\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8wqk\" (UniqueName: \"kubernetes.io/projected/3f9f057c-6ace-415c-94e4-339204749514-kube-api-access-w8wqk\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdh8s\" (UniqueName: \"kubernetes.io/projected/c6128194-41ca-4dbf-a538-c346ec94bd50-kube-api-access-qdh8s\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-modprobe-d\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614522 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysctl-conf\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 
ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3f9f057c-6ace-415c-94e4-339204749514-etc-tuned\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f9f057c-6ace-415c-94e4-339204749514-tmp\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92c3499b-a30d-4470-8b98-4e3a3b91de06-hosts-file\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.614617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-system-cni-dir\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.615286 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.614641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-cnibin\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.615286 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.615028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7lsts\"" Apr 16 20:57:54.615286 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.615242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:57:54.616977 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.616958 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.618948 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618298 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qbzjk\"" Apr 16 20:57:54.618948 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618302 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:57:54.618948 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618335 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:57:54.618948 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:57:54.618948 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618861 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.618948 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618913 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.619240 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.618740 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:57:54.619294 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.619280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:54.619400 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.619353 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:57:54.619801 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.619750 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:57:54.619801 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.619750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d5c29\"" Apr 16 20:57:54.619955 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.619755 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.620114 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.620096 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.621679 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.621641 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:54.621772 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.621710 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:57:54.623953 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.623936 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.626534 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.626227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.627104 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.626934 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:57:54.627172 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.627130 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:57:54.627912 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.627895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cd57d\"" Apr 16 20:57:54.628735 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.628714 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:57:54.628838 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.628795 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:57:54.629056 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.629029 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:57:54.629835 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.629616 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-947t8\"" Apr 16 20:57:54.656261 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.656232 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:52:53 +0000 UTC" deadline="2027-10-14 22:14:17.24883288 +0000 UTC" Apr 16 20:57:54.656261 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.656260 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13105h16m22.592578005s" Apr 16 20:57:54.705318 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.705291 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:57:54.715925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.715767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.715925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.715823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-run-netns\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.715925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.715859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-sys-fs\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.715925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.715883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-k8s-cni-cncf-io\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.716224 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.715941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-netns\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.716224 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-conf-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.716224 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdbq\" (UniqueName: \"kubernetes.io/projected/92c3499b-a30d-4470-8b98-4e3a3b91de06-kube-api-access-rgdbq\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.716224 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-daemon-config\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.716224 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysconfig\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716230 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-cni-netd\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-serviceca\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-cni-binary-copy\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f9f057c-6ace-415c-94e4-339204749514-tmp\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-system-cni-dir\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysconfig\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.716478 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-system-cni-dir\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: 
\"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-ovn\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-etc-selinux\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqvt\" (UniqueName: \"kubernetes.io/projected/c364ce0e-4d4b-44b6-957a-88b603b24167-kube-api-access-6kqvt\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-multus-certs\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716745 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-env-overrides\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.716843 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffb608ad-9008-418c-a258-d80cf876c140-ovn-node-metrics-cert\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c364ce0e-4d4b-44b6-957a-88b603b24167-host-slash\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-cnibin\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-kubelet\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mwmw\" (UniqueName: \"kubernetes.io/projected/85e4090a-8dbc-4412-b20a-23d79d838363-kube-api-access-2mwmw\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 
16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-system-cni-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.716996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-socket-dir-parent\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8wqk\" (UniqueName: \"kubernetes.io/projected/3f9f057c-6ace-415c-94e4-339204749514-kube-api-access-w8wqk\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdh8s\" (UniqueName: \"kubernetes.io/projected/c6128194-41ca-4dbf-a538-c346ec94bd50-kube-api-access-qdh8s\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-var-lib-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-ovnkube-script-lib\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb2xn\" (UniqueName: \"kubernetes.io/projected/ffb608ad-9008-418c-a258-d80cf876c140-kube-api-access-tb2xn\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysctl-conf\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3f9f057c-6ace-415c-94e4-339204749514-etc-tuned\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717263 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92c3499b-a30d-4470-8b98-4e3a3b91de06-hosts-file\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-cnibin\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-cnibin\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysctl-conf\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-systemd\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-cni-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-cni-multus\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysctl-d\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717613 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-var-lib-kubelet\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-kubelet\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-socket-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-registration-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-run\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-lib-modules\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-os-release\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-node-log\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.717949 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-ovnkube-config\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4x56\" (UniqueName: \"kubernetes.io/projected/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-kube-api-access-s4x56\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.717951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-run\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92c3499b-a30d-4470-8b98-4e3a3b91de06-hosts-file\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-sysctl-d\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-var-lib-kubelet\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6128194-41ca-4dbf-a538-c346ec94bd50-os-release\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-kubernetes\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718122 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-lib-modules\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-host\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-cni-bin\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-kubernetes\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-modprobe-d\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-log-socket\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-hostroot\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-modprobe-d\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-host\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718386 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92c3499b-a30d-4470-8b98-4e3a3b91de06-tmp-dir\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.718703 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-host\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-slash\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-etc-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-device-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-etc-kubernetes\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-systemd\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-cni-binary-copy\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " 
pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92c3499b-a30d-4470-8b98-4e3a3b91de06-tmp-dir\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5m5w\" (UniqueName: \"kubernetes.io/projected/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-kube-api-access-q5m5w\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jr7q\" (UniqueName: \"kubernetes.io/projected/a9785ef4-4aae-480f-9acc-3797a9cc8b98-kube-api-access-9jr7q\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-sys\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-etc-systemd\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-systemd-units\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f9f057c-6ace-415c-94e4-339204749514-sys\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-cni-bin\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.719576 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/655d4da6-65a0-4fa0-98ef-e68070f32130-konnectivity-ca\") pod \"konnectivity-agent-q97vd\" (UID: \"655d4da6-65a0-4fa0-98ef-e68070f32130\") " pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c364ce0e-4d4b-44b6-957a-88b603b24167-iptables-alerter-script\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.718998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-run-ovn-kubernetes\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.719025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.719049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/655d4da6-65a0-4fa0-98ef-e68070f32130-agent-certs\") pod \"konnectivity-agent-q97vd\" (UID: \"655d4da6-65a0-4fa0-98ef-e68070f32130\") " pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.719110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-os-release\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.719197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-cni-binary-copy\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.719239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c6128194-41ca-4dbf-a538-c346ec94bd50-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " 
pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.720213 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.719702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal" event={"ID":"60c9ae910eb32de4a33e8b075589ffa2","Type":"ContainerStarted","Data":"e12e564d4e1cca469bc5751c8ccd0311d7592e9f4d6a3b178f7bcd77cfecc7a6"} Apr 16 20:57:54.720637 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.720617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f9f057c-6ace-415c-94e4-339204749514-tmp\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.720714 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.720695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3f9f057c-6ace-415c-94e4-339204749514-etc-tuned\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.721036 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.720996 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal" event={"ID":"334c8afc5dc3a1fc4c297cab1fdb85d1","Type":"ContainerStarted","Data":"1a0ed28cc3d5daa9170298609c013ad3e57362e376e41f7981ef41d0f96ab66f"} Apr 16 20:57:54.729429 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.729405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8wqk\" (UniqueName: \"kubernetes.io/projected/3f9f057c-6ace-415c-94e4-339204749514-kube-api-access-w8wqk\") pod \"tuned-tnqpt\" (UID: \"3f9f057c-6ace-415c-94e4-339204749514\") " pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.729429 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.729424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdbq\" (UniqueName: \"kubernetes.io/projected/92c3499b-a30d-4470-8b98-4e3a3b91de06-kube-api-access-rgdbq\") pod \"node-resolver-tzlx6\" (UID: \"92c3499b-a30d-4470-8b98-4e3a3b91de06\") " pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.729929 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.729904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdh8s\" (UniqueName: \"kubernetes.io/projected/c6128194-41ca-4dbf-a538-c346ec94bd50-kube-api-access-qdh8s\") pod \"multus-additional-cni-plugins-8jcqc\" (UID: \"c6128194-41ca-4dbf-a538-c346ec94bd50\") " pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.819925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-log-socket\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.819925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-hostroot\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" 
Apr 16 20:57:54.819925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-slash\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-etc-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-hostroot\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-log-socket\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-device-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.819994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-slash\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-etc-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-device-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-etc-kubernetes\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-etc-kubernetes\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5m5w\" (UniqueName: \"kubernetes.io/projected/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-kube-api-access-q5m5w\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jr7q\" (UniqueName: \"kubernetes.io/projected/a9785ef4-4aae-480f-9acc-3797a9cc8b98-kube-api-access-9jr7q\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-systemd-units\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-cni-bin\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/655d4da6-65a0-4fa0-98ef-e68070f32130-konnectivity-ca\") pod \"konnectivity-agent-q97vd\" (UID: \"655d4da6-65a0-4fa0-98ef-e68070f32130\") " pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-systemd-units\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c364ce0e-4d4b-44b6-957a-88b603b24167-iptables-alerter-script\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-cni-bin\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-run-ovn-kubernetes\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/655d4da6-65a0-4fa0-98ef-e68070f32130-agent-certs\") pod \"konnectivity-agent-q97vd\" (UID: \"655d4da6-65a0-4fa0-98ef-e68070f32130\") " pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-os-release\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-run-netns\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-run-ovn-kubernetes\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820417 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-sys-fs\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-os-release\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-k8s-cni-cncf-io\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-sys-fs\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-netns\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-conf-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-conf-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.820567 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:54.820925 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-netns\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-k8s-cni-cncf-io\") pod 
\"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.820626 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:57:55.320607541 +0000 UTC m=+3.110224530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-run-netns\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-daemon-config\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-cni-netd\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-serviceca\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-cni-binary-copy\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-ovn\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-etc-selinux\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqvt\" (UniqueName: \"kubernetes.io/projected/c364ce0e-4d4b-44b6-957a-88b603b24167-kube-api-access-6kqvt\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-cni-netd\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c364ce0e-4d4b-44b6-957a-88b603b24167-iptables-alerter-script\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-multus-certs\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-ovn\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-etc-selinux\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.821717 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-run-multus-certs\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.820916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-env-overrides\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffb608ad-9008-418c-a258-d80cf876c140-ovn-node-metrics-cert\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c364ce0e-4d4b-44b6-957a-88b603b24167-host-slash\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-cnibin\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-kubelet\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-serviceca\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mwmw\" (UniqueName: \"kubernetes.io/projected/85e4090a-8dbc-4412-b20a-23d79d838363-kube-api-access-2mwmw\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821295 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-daemon-config\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-system-cni-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-socket-dir-parent\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-var-lib-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-kubelet\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-ovnkube-script-lib\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.822464 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tb2xn\" (UniqueName: \"kubernetes.io/projected/ffb608ad-9008-418c-a258-d80cf876c140-kube-api-access-tb2xn\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-cni-binary-copy\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 
20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-cnibin\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-systemd\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-cni-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-system-cni-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-var-lib-openvswitch\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-cni-multus\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-socket-dir-parent\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-kubelet\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-socket-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.823134 
ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-env-overrides\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-multus-cni-dir\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-cni-multus\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/655d4da6-65a0-4fa0-98ef-e68070f32130-konnectivity-ca\") pod \"konnectivity-agent-q97vd\" (UID: \"655d4da6-65a0-4fa0-98ef-e68070f32130\") " pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-registration-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-host-kubelet\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:54.823134 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-registration-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-node-log\") pod \"ovnkube-node-tzbn8\" (UID: 
\"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9785ef4-4aae-480f-9acc-3797a9cc8b98-socket-dir\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-ovnkube-config\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-node-log\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4x56\" (UniqueName: \"kubernetes.io/projected/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-kube-api-access-s4x56\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffb608ad-9008-418c-a258-d80cf876c140-run-systemd\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-host\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-cni-bin\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-host-var-lib-cni-bin\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.821955 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-host\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " 
pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.822040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-ovnkube-script-lib\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.822070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c364ce0e-4d4b-44b6-957a-88b603b24167-host-slash\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.822186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffb608ad-9008-418c-a258-d80cf876c140-ovnkube-config\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.823720 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.823149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/655d4da6-65a0-4fa0-98ef-e68070f32130-agent-certs\") pod \"konnectivity-agent-q97vd\" (UID: \"655d4da6-65a0-4fa0-98ef-e68070f32130\") " pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.824173 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.823760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffb608ad-9008-418c-a258-d80cf876c140-ovn-node-metrics-cert\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.840959 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.840933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mwmw\" (UniqueName: \"kubernetes.io/projected/85e4090a-8dbc-4412-b20a-23d79d838363-kube-api-access-2mwmw\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:54.843085 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.843060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb2xn\" (UniqueName: \"kubernetes.io/projected/ffb608ad-9008-418c-a258-d80cf876c140-kube-api-access-tb2xn\") pod \"ovnkube-node-tzbn8\" (UID: \"ffb608ad-9008-418c-a258-d80cf876c140\") " pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.843253 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.843234 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqvt\" (UniqueName: \"kubernetes.io/projected/c364ce0e-4d4b-44b6-957a-88b603b24167-kube-api-access-6kqvt\") pod \"iptables-alerter-hhb77\" (UID: \"c364ce0e-4d4b-44b6-957a-88b603b24167\") " pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.846511 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.846487 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:57:54.846607 
ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.846515 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:57:54.846607 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.846531 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgsxh for pod openshift-network-diagnostics/network-check-target-nngg9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:54.846732 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:54.846607 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh podName:a8b58c12-2972-4947-9fea-b3f94f82f207 nodeName:}" failed. No retries permitted until 2026-04-16 20:57:55.346589339 +0000 UTC m=+3.136206339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dgsxh" (UniqueName: "kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh") pod "network-check-target-nngg9" (UID: "a8b58c12-2972-4947-9fea-b3f94f82f207") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:54.847734 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.847712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4x56\" (UniqueName: \"kubernetes.io/projected/8d0ad4e1-5b83-4118-baa5-e8531b28ca54-kube-api-access-s4x56\") pod \"multus-bbf6s\" (UID: \"8d0ad4e1-5b83-4118-baa5-e8531b28ca54\") " pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.847941 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.847915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jr7q\" (UniqueName: \"kubernetes.io/projected/a9785ef4-4aae-480f-9acc-3797a9cc8b98-kube-api-access-9jr7q\") pod \"aws-ebs-csi-driver-node-d6b6x\" (UID: \"a9785ef4-4aae-480f-9acc-3797a9cc8b98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:54.848052 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.847947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5m5w\" (UniqueName: \"kubernetes.io/projected/63a48313-4beb-4c3e-89bb-bddcb5b76d5a-kube-api-access-q5m5w\") pod \"node-ca-n7hjd\" (UID: \"63a48313-4beb-4c3e-89bb-bddcb5b76d5a\") " pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.914981 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.914944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" Apr 16 20:57:54.921937 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.921896 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" Apr 16 20:57:54.931902 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.931876 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tzlx6" Apr 16 20:57:54.938336 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.938315 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hhb77" Apr 16 20:57:54.945921 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.945899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bbf6s" Apr 16 20:57:54.953392 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.953374 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:57:54.960916 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.960896 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n7hjd" Apr 16 20:57:54.967466 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.967430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:57:54.974025 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:54.973992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" Apr 16 20:57:55.069663 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.069631 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:57:55.325952 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.325911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:55.326104 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.326052 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:55.326181 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.326131 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:57:56.326110148 +0000 UTC m=+4.115727138 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:55.426389 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.426348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:55.426565 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.426528 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:57:55.426565 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.426548 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:57:55.426565 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.426558 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgsxh for pod openshift-network-diagnostics/network-check-target-nngg9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:55.426737 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.426609 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh podName:a8b58c12-2972-4947-9fea-b3f94f82f207 nodeName:}" failed. No retries permitted until 2026-04-16 20:57:56.426594621 +0000 UTC m=+4.216211604 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dgsxh" (UniqueName: "kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh") pod "network-check-target-nngg9" (UID: "a8b58c12-2972-4947-9fea-b3f94f82f207") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:55.480738 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.480698 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9f057c_6ace_415c_94e4_339204749514.slice/crio-5b331c1ff6805e07a0283b56c9aa571499f4ee8a70330b9049e13d754623b929 WatchSource:0}: Error finding container 5b331c1ff6805e07a0283b56c9aa571499f4ee8a70330b9049e13d754623b929: Status 404 returned error can't find the container with id 5b331c1ff6805e07a0283b56c9aa571499f4ee8a70330b9049e13d754623b929 Apr 16 20:57:55.486593 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.486565 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc364ce0e_4d4b_44b6_957a_88b603b24167.slice/crio-61623ea827e952f40f34b5f7cadde03a758e5cc20465fd44ef7b1919f0c42cfb WatchSource:0}: Error finding container 61623ea827e952f40f34b5f7cadde03a758e5cc20465fd44ef7b1919f0c42cfb: Status 404 returned error can't find the container with id 61623ea827e952f40f34b5f7cadde03a758e5cc20465fd44ef7b1919f0c42cfb Apr 16 20:57:55.488220 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.488106 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c3499b_a30d_4470_8b98_4e3a3b91de06.slice/crio-02bfe28f9437d7c764b428e3fc026d640ef518af21d6ec8e7b80478c8f8de8bb WatchSource:0}: Error finding container 02bfe28f9437d7c764b428e3fc026d640ef518af21d6ec8e7b80478c8f8de8bb: Status 404 returned error can't find the container with id 02bfe28f9437d7c764b428e3fc026d640ef518af21d6ec8e7b80478c8f8de8bb Apr 16 20:57:55.489924 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.489895 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6128194_41ca_4dbf_a538_c346ec94bd50.slice/crio-4a4e08cd57221ecbe3c052dae7cef7c4938aae194ff5eace652d6b325a0d502c WatchSource:0}: Error finding container 4a4e08cd57221ecbe3c052dae7cef7c4938aae194ff5eace652d6b325a0d502c: Status 404 returned error can't find the container with id 4a4e08cd57221ecbe3c052dae7cef7c4938aae194ff5eace652d6b325a0d502c Apr 16 20:57:55.491707 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.491636 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d0ad4e1_5b83_4118_baa5_e8531b28ca54.slice/crio-ca242b67883726512d02dffa5c9853b837a56c16d4bb68d9bfdec696820c8b07 WatchSource:0}: Error finding container ca242b67883726512d02dffa5c9853b837a56c16d4bb68d9bfdec696820c8b07: Status 404 returned error can't find the container with id ca242b67883726512d02dffa5c9853b837a56c16d4bb68d9bfdec696820c8b07 Apr 16 20:57:55.493268 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.493246 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63a48313_4beb_4c3e_89bb_bddcb5b76d5a.slice/crio-86a5bd254f244af49ab066d0eb7299a2a9353f4c534b3f9b18727a858e5af27b WatchSource:0}: Error finding 
container 86a5bd254f244af49ab066d0eb7299a2a9353f4c534b3f9b18727a858e5af27b: Status 404 returned error can't find the container with id 86a5bd254f244af49ab066d0eb7299a2a9353f4c534b3f9b18727a858e5af27b Apr 16 20:57:55.493937 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.493912 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffb608ad_9008_418c_a258_d80cf876c140.slice/crio-fd5809716388233650ec09507d9537b452b5956500d299e88b8f9de151a405bb WatchSource:0}: Error finding container fd5809716388233650ec09507d9537b452b5956500d299e88b8f9de151a405bb: Status 404 returned error can't find the container with id fd5809716388233650ec09507d9537b452b5956500d299e88b8f9de151a405bb Apr 16 20:57:55.494915 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.494827 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655d4da6_65a0_4fa0_98ef_e68070f32130.slice/crio-ed127cdccecdbe2c31d99d05ecfe94cc881b2d57229e1d56caaa775f9a44294b WatchSource:0}: Error finding container ed127cdccecdbe2c31d99d05ecfe94cc881b2d57229e1d56caaa775f9a44294b: Status 404 returned error can't find the container with id ed127cdccecdbe2c31d99d05ecfe94cc881b2d57229e1d56caaa775f9a44294b Apr 16 20:57:55.516361 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:57:55.516342 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9785ef4_4aae_480f_9acc_3797a9cc8b98.slice/crio-75edd0a99773ba48b3e32b612c96d9e8aa196338ea657af198d34592cbe74c85 WatchSource:0}: Error finding container 75edd0a99773ba48b3e32b612c96d9e8aa196338ea657af198d34592cbe74c85: Status 404 returned error can't find the container with id 75edd0a99773ba48b3e32b612c96d9e8aa196338ea657af198d34592cbe74c85 Apr 16 20:57:55.656718 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.656678 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:52:53 +0000 UTC" deadline="2027-10-07 10:28:36.3188029 +0000 UTC" Apr 16 20:57:55.656718 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.656711 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12925h30m40.662094851s" Apr 16 20:57:55.715252 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.715222 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:55.715402 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:55.715329 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
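
The recurring "network is not ready" failures all trace back to one condition: the container runtime reports NetworkReady=false while /etc/kubernetes/cni/net.d/ holds no CNI configuration, which at this point in the boot simply means ovnkube-node-tzbn8 has not yet written its config file. The following standalone sketch shows roughly what such a readiness gate checks; the exact file-extension rules are an assumption, not quoted from the runtime.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// A rough stand-in for the readiness check behind "no CNI configuration
// file in /etc/kubernetes/cni/net.d/": the runtime keeps reporting
// NetworkReady=false until at least one usable CNI conf file appears.
func main() {
	dir := "/etc/kubernetes/cni/net.d" // the path named in the log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("NetworkReady=false:", err)
		return
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("NetworkReady=true, using", e.Name())
			return
		}
	}
	fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
}

Host-network pods such as ovnkube-node, multus, and iptables-alerter are exempt from this gate, which is why their sandboxes start above while network-check-target-nngg9 and network-metrics-daemon-zbj49 stay pending.
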
pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:57:55.723499 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.723468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7hjd" event={"ID":"63a48313-4beb-4c3e-89bb-bddcb5b76d5a","Type":"ContainerStarted","Data":"86a5bd254f244af49ab066d0eb7299a2a9353f4c534b3f9b18727a858e5af27b"} Apr 16 20:57:55.724375 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.724346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bbf6s" event={"ID":"8d0ad4e1-5b83-4118-baa5-e8531b28ca54","Type":"ContainerStarted","Data":"ca242b67883726512d02dffa5c9853b837a56c16d4bb68d9bfdec696820c8b07"} Apr 16 20:57:55.725247 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.725224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerStarted","Data":"4a4e08cd57221ecbe3c052dae7cef7c4938aae194ff5eace652d6b325a0d502c"} Apr 16 20:57:55.726113 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.726091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tzlx6" event={"ID":"92c3499b-a30d-4470-8b98-4e3a3b91de06","Type":"ContainerStarted","Data":"02bfe28f9437d7c764b428e3fc026d640ef518af21d6ec8e7b80478c8f8de8bb"} Apr 16 20:57:55.729475 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.729433 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hhb77" event={"ID":"c364ce0e-4d4b-44b6-957a-88b603b24167","Type":"ContainerStarted","Data":"61623ea827e952f40f34b5f7cadde03a758e5cc20465fd44ef7b1919f0c42cfb"} Apr 16 20:57:55.731275 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.731243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal" event={"ID":"334c8afc5dc3a1fc4c297cab1fdb85d1","Type":"ContainerStarted","Data":"0b612cdc947f8b658e238580d32126d186e2ee0e35b2e73c3e103d98e79eb414"} Apr 16 20:57:55.732208 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.732187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" event={"ID":"a9785ef4-4aae-480f-9acc-3797a9cc8b98","Type":"ContainerStarted","Data":"75edd0a99773ba48b3e32b612c96d9e8aa196338ea657af198d34592cbe74c85"} Apr 16 20:57:55.733081 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.733061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q97vd" event={"ID":"655d4da6-65a0-4fa0-98ef-e68070f32130","Type":"ContainerStarted","Data":"ed127cdccecdbe2c31d99d05ecfe94cc881b2d57229e1d56caaa775f9a44294b"} Apr 16 20:57:55.734237 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.734217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"fd5809716388233650ec09507d9537b452b5956500d299e88b8f9de151a405bb"} Apr 16 20:57:55.735209 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:55.735186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" event={"ID":"3f9f057c-6ace-415c-94e4-339204749514","Type":"ContainerStarted","Data":"5b331c1ff6805e07a0283b56c9aa571499f4ee8a70330b9049e13d754623b929"} Apr 16 20:57:55.747701 ip-10-0-141-171 
kubenswrapper[2575]: I0416 20:57:55.747654 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-171.ec2.internal" podStartSLOduration=2.747640212 podStartE2EDuration="2.747640212s" podCreationTimestamp="2026-04-16 20:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:57:55.747343253 +0000 UTC m=+3.536960258" watchObservedRunningTime="2026-04-16 20:57:55.747640212 +0000 UTC m=+3.537257217" Apr 16 20:57:56.334842 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:56.334803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:56.334987 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.334963 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:56.335048 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.335028 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:57:58.335008874 +0000 UTC m=+6.124625875 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:56.435515 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:56.435418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:56.435694 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.435646 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:57:56.435694 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.435667 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:57:56.435694 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.435680 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgsxh for pod openshift-network-diagnostics/network-check-target-nngg9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:56.435846 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.435739 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh podName:a8b58c12-2972-4947-9fea-b3f94f82f207 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:57:58.43572067 +0000 UTC m=+6.225337655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgsxh" (UniqueName: "kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh") pod "network-check-target-nngg9" (UID: "a8b58c12-2972-4947-9fea-b3f94f82f207") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:56.715308 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:56.714656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:56.715308 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:56.714819 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:57:56.748999 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:56.748794 2575 generic.go:358] "Generic (PLEG): container finished" podID="60c9ae910eb32de4a33e8b075589ffa2" containerID="71305ca34a84d21a37e760084257308333d09a2ced025aaa2278b017f8aff1a9" exitCode=0 Apr 16 20:57:56.748999 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:56.748945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal" event={"ID":"60c9ae910eb32de4a33e8b075589ffa2","Type":"ContainerDied","Data":"71305ca34a84d21a37e760084257308333d09a2ced025aaa2278b017f8aff1a9"} Apr 16 20:57:57.715581 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:57.715372 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:57.715581 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:57.715538 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:57:57.772970 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:57.772930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal" event={"ID":"60c9ae910eb32de4a33e8b075589ffa2","Type":"ContainerStarted","Data":"16ad624bd3f07fc87209d788d99416bc34d9ad1e40eb473d96e1272bcc483b9d"} Apr 16 20:57:57.788827 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:57.787905 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-171.ec2.internal" podStartSLOduration=4.787887056 podStartE2EDuration="4.787887056s" podCreationTimestamp="2026-04-16 20:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:57:57.787232307 +0000 UTC m=+5.576849313" watchObservedRunningTime="2026-04-16 20:57:57.787887056 +0000 UTC m=+5.577504056" Apr 16 20:57:58.350198 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:58.349619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:58.350198 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.349785 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:58.350198 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.349861 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:02.349842337 +0000 UTC m=+10.139459320 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:57:58.451223 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:58.450575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:58.451223 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.450776 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:57:58.451223 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.450796 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:57:58.451223 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.450808 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgsxh for pod openshift-network-diagnostics/network-check-target-nngg9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:58.451223 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.450868 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh podName:a8b58c12-2972-4947-9fea-b3f94f82f207 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:02.450848492 +0000 UTC m=+10.240465487 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgsxh" (UniqueName: "kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh") pod "network-check-target-nngg9" (UID: "a8b58c12-2972-4947-9fea-b3f94f82f207") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:57:58.715291 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:58.714849 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:57:58.715291 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:58.714971 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:57:59.715171 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:57:59.714672 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:57:59.715171 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:57:59.714808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:00.715748 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:00.715497 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:00.715748 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:00.715659 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:01.714661 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:01.714622 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:01.714842 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:01.714761 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:02.382961 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:02.382921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:02.383432 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.383100 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:02.383432 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.383171 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:10.383151552 +0000 UTC m=+18.172768549 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:02.484404 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:02.484356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:02.484649 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.484582 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:02.484649 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.484604 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:02.484649 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.484614 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgsxh for pod openshift-network-diagnostics/network-check-target-nngg9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:02.484892 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.484676 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh podName:a8b58c12-2972-4947-9fea-b3f94f82f207 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:10.484656154 +0000 UTC m=+18.274273137 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgsxh" (UniqueName: "kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh") pod "network-check-target-nngg9" (UID: "a8b58c12-2972-4947-9fea-b3f94f82f207") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:02.716009 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:02.715929 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:02.716167 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:02.716069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:03.714883 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:03.714842 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:03.715337 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:03.714986 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:04.714647 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:04.714566 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:04.714785 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:04.714720 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:05.714484 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:05.714430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:05.714891 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:05.714587 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:06.533227 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.533197 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qhtpj"] Apr 16 20:58:06.535876 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.535853 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.536011 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:06.535935 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
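
The SyncLoop ADD for global-pull-secret-syncer-qhtpj just above kicks off the same volume choreography already seen for the other pods, as the next entries show: the pod's volumes enter the desired state of world, VerifyControllerAttachedVolume and "MountVolume started" follow on the next reconcile pass, and each volume either reaches "SetUp succeeded" or re-queues with backoff. A toy rendering of that loop; the names here are illustrative, not kubelet's actual types.

package main

import "fmt"

func main() {
	desired := []string{"kubelet-config", "dbus", "original-pull-secret"}
	mounted := map[string]bool{}
	// host-path volumes resolve immediately; the secret is not yet registered
	available := map[string]bool{"kubelet-config": true, "dbus": true}

	for _, v := range desired {
		if mounted[v] {
			continue
		}
		fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
		if !available[v] {
			fmt.Printf("MountVolume.SetUp failed for volume %q: object not registered\n", v)
			continue // re-queued and retried on a later reconcile pass, with backoff
		}
		mounted[v] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
	}
}

In the entries that follow, the two host-path volumes succeed immediately while original-pull-secret loops through exactly this failure branch.
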
pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:06.611494 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.611450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.611662 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.611512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90b5377e-36eb-4780-83e8-96ffc2917ab6-dbus\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.611662 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.611608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90b5377e-36eb-4780-83e8-96ffc2917ab6-kubelet-config\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.712028 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.711995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90b5377e-36eb-4780-83e8-96ffc2917ab6-kubelet-config\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.712202 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.712051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.712202 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.712077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90b5377e-36eb-4780-83e8-96ffc2917ab6-dbus\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.712202 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.712124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90b5377e-36eb-4780-83e8-96ffc2917ab6-kubelet-config\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.712370 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.712222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90b5377e-36eb-4780-83e8-96ffc2917ab6-dbus\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:06.712370 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:06.712221 2575 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:06.712370 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:06.712283 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret podName:90b5377e-36eb-4780-83e8-96ffc2917ab6 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:07.212268982 +0000 UTC m=+15.001885965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret") pod "global-pull-secret-syncer-qhtpj" (UID: "90b5377e-36eb-4780-83e8-96ffc2917ab6") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:06.715158 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:06.715138 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:06.715538 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:06.715253 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:07.215031 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:07.214994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:07.215216 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:07.215143 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:07.215216 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:07.215210 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret podName:90b5377e-36eb-4780-83e8-96ffc2917ab6 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:08.215193423 +0000 UTC m=+16.004810407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret") pod "global-pull-secret-syncer-qhtpj" (UID: "90b5377e-36eb-4780-83e8-96ffc2917ab6") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:07.714535 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:07.714452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:07.714699 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:07.714581 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
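
The repeated object "kube-system"/"original-pull-secret" not registered is not a permissions problem: the kubelet serves secrets and configmaps for mounts out of a local manager that must first register, and then sync, each object a pod references, and a lookup that races ahead of that sync fails with exactly this wording. An illustrative cache follows, assuming the register-then-get behavior rather than quoting kubelet internals.

package main

import (
	"fmt"
	"sync"
)

type secretCache struct {
	mu   sync.Mutex
	data map[string][]byte // key: "namespace/name"
}

func (c *secretCache) Register(ns, name string, payload []byte) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[ns+"/"+name] = payload
}

func (c *secretCache) Get(ns, name string) ([]byte, error) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if p, ok := c.data[ns+"/"+name]; ok {
		return p, nil
	}
	return nil, fmt.Errorf("object %q/%q not registered", ns, name)
}

func main() {
	c := &secretCache{data: map[string][]byte{}}
	if _, err := c.Get("kube-system", "original-pull-secret"); err != nil {
		fmt.Println(err) // the error the mount path keeps hitting above
	}
	c.Register("kube-system", "original-pull-secret", []byte("{...}"))
	if _, err := c.Get("kube-system", "original-pull-secret"); err == nil {
		fmt.Println("registered; the next backoff retry will mount it")
	}
}
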
pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:08.220661 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:08.220626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:08.221068 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:08.220775 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:08.221068 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:08.220852 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret podName:90b5377e-36eb-4780-83e8-96ffc2917ab6 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:10.220831971 +0000 UTC m=+18.010448969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret") pod "global-pull-secret-syncer-qhtpj" (UID: "90b5377e-36eb-4780-83e8-96ffc2917ab6") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:08.714969 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:08.714895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:08.715127 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:08.714908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:08.715127 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:08.715030 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:08.715213 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:08.715144 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:09.714724 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:09.714681 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:09.715238 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:09.714826 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:10.239580 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:10.239536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:10.239771 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.239705 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:10.239815 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.239781 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret podName:90b5377e-36eb-4780-83e8-96ffc2917ab6 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:14.239763592 +0000 UTC m=+22.029380575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret") pod "global-pull-secret-syncer-qhtpj" (UID: "90b5377e-36eb-4780-83e8-96ffc2917ab6") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:10.441256 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:10.441212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:10.441457 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.441383 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:10.441519 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.441463 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:26.441433537 +0000 UTC m=+34.231050523 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:58:10.542340 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:10.542259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:10.542513 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.542404 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:58:10.542513 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.542421 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:58:10.542513 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.542431 2575 projected.go:194] Error preparing data for projected volume kube-api-access-dgsxh for pod openshift-network-diagnostics/network-check-target-nngg9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:10.542513 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.542502 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh podName:a8b58c12-2972-4947-9fea-b3f94f82f207 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:26.542486322 +0000 UTC m=+34.332103330 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dgsxh" (UniqueName: "kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh") pod "network-check-target-nngg9" (UID: "a8b58c12-2972-4947-9fea-b3f94f82f207") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:58:10.714739 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:10.714702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:10.714918 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:10.714702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:10.714918 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.714852 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:10.715291 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:10.714923 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:11.714550 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:11.714508 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:11.714738 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:11.714645 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:12.715400 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:12.715376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:12.715799 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:12.715500 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:12.716214 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:12.716030 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:12.716214 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:12.716142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:13.714582 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.714406 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:13.714745 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:13.714673 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:13.815458 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.815410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7hjd" event={"ID":"63a48313-4beb-4c3e-89bb-bddcb5b76d5a","Type":"ContainerStarted","Data":"0e069da01fc9297563db5330a6cb8847f9ae781e052d0568cdb287b167720dc7"} Apr 16 20:58:13.817046 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.817015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bbf6s" event={"ID":"8d0ad4e1-5b83-4118-baa5-e8531b28ca54","Type":"ContainerStarted","Data":"b88c299dbc979acbf9c0c916c3c487a064ec8a078e1a7009dbcb934c2e612b96"} Apr 16 20:58:13.818725 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.818698 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6128194-41ca-4dbf-a538-c346ec94bd50" containerID="0b4c7151c43b8a7b037604ed3686919971acbc446778b3f8fabced8f8a731d34" exitCode=0 Apr 16 20:58:13.818810 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.818777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerDied","Data":"0b4c7151c43b8a7b037604ed3686919971acbc446778b3f8fabced8f8a731d34"} Apr 16 20:58:13.820409 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.820228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tzlx6" event={"ID":"92c3499b-a30d-4470-8b98-4e3a3b91de06","Type":"ContainerStarted","Data":"d3c6ba69096824e13c27267b7f314b5b42de7cda7106ae8f5d912d707ac29549"} Apr 16 20:58:13.822566 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.822537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" event={"ID":"a9785ef4-4aae-480f-9acc-3797a9cc8b98","Type":"ContainerStarted","Data":"17aa0303e915c373e3f7214395382d58519dc19759f7bc0f353c3ba26d99262f"} Apr 16 20:58:13.824418 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.824392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q97vd" event={"ID":"655d4da6-65a0-4fa0-98ef-e68070f32130","Type":"ContainerStarted","Data":"cced3f71563880cc4070f883611055e718e3e137738588015fcfe38d8035b7aa"} Apr 16 20:58:13.827399 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827381 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log" Apr 16 20:58:13.827768 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827748 2575 generic.go:358] "Generic (PLEG): container finished" podID="ffb608ad-9008-418c-a258-d80cf876c140" containerID="56d113a53036bdd9d48672889790c512e48f786ba7bf26b538b44ca396035cf3" exitCode=1 Apr 16 20:58:13.827855 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"aa182c04744993d45678d4e91fe4ce4eb8874ba69c3155209eba79aed64cf709"} Apr 16 20:58:13.827855 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" 
event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"86395a3b6ada072034124688408ef1c38ab12107cc9b105fe5b57ceadb5847fd"} Apr 16 20:58:13.827855 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"97c1252b9687e591900fdb89eed125668619538376f920b5c61a122c4a59fff7"} Apr 16 20:58:13.827855 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"cdcb665eccedd885dfe3c28e931ddc0de4b069456939e2a00b7b9e24fa714338"} Apr 16 20:58:13.827855 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerDied","Data":"56d113a53036bdd9d48672889790c512e48f786ba7bf26b538b44ca396035cf3"} Apr 16 20:58:13.828042 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.827870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"16243a40a0c4400ffd8800be530e8ac963b04e5314d5518c573a25168cc41727"} Apr 16 20:58:13.829030 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.829007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" event={"ID":"3f9f057c-6ace-415c-94e4-339204749514","Type":"ContainerStarted","Data":"11450f6f8a11b22a0f0eebd9268944f6dcd17cafb852f61e5fc97fafaf0391df"} Apr 16 20:58:13.829490 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.829429 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n7hjd" podStartSLOduration=9.544374452 podStartE2EDuration="21.829415734s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.515249564 +0000 UTC m=+3.304866551" lastFinishedPulling="2026-04-16 20:58:07.800290836 +0000 UTC m=+15.589907833" observedRunningTime="2026-04-16 20:58:13.829231989 +0000 UTC m=+21.618848995" watchObservedRunningTime="2026-04-16 20:58:13.829415734 +0000 UTC m=+21.619032742" Apr 16 20:58:13.843878 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.843841 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tnqpt" podStartSLOduration=4.7666915979999995 podStartE2EDuration="21.843830464s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.484849507 +0000 UTC m=+3.274466505" lastFinishedPulling="2026-04-16 20:58:12.561988386 +0000 UTC m=+20.351605371" observedRunningTime="2026-04-16 20:58:13.8438236 +0000 UTC m=+21.633440605" watchObservedRunningTime="2026-04-16 20:58:13.843830464 +0000 UTC m=+21.633447469" Apr 16 20:58:13.868309 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.867924 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bbf6s" podStartSLOduration=4.761883761 podStartE2EDuration="21.867909804s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.493327592 +0000 UTC m=+3.282944583" 
lastFinishedPulling="2026-04-16 20:58:12.599353644 +0000 UTC m=+20.388970626" observedRunningTime="2026-04-16 20:58:13.867684938 +0000 UTC m=+21.657301945" watchObservedRunningTime="2026-04-16 20:58:13.867909804 +0000 UTC m=+21.657526812" Apr 16 20:58:13.913136 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.911577 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q97vd" podStartSLOduration=3.882862696 podStartE2EDuration="20.91155899s" podCreationTimestamp="2026-04-16 20:57:53 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.515240584 +0000 UTC m=+3.304857567" lastFinishedPulling="2026-04-16 20:58:12.543936862 +0000 UTC m=+20.333553861" observedRunningTime="2026-04-16 20:58:13.886012941 +0000 UTC m=+21.675629943" watchObservedRunningTime="2026-04-16 20:58:13.91155899 +0000 UTC m=+21.701175996" Apr 16 20:58:13.923762 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:13.923723 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tzlx6" podStartSLOduration=4.851714627 podStartE2EDuration="21.923710196s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.489953401 +0000 UTC m=+3.279570399" lastFinishedPulling="2026-04-16 20:58:12.561948982 +0000 UTC m=+20.351565968" observedRunningTime="2026-04-16 20:58:13.923292897 +0000 UTC m=+21.712909912" watchObservedRunningTime="2026-04-16 20:58:13.923710196 +0000 UTC m=+21.713327201" Apr 16 20:58:14.270998 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.270968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:14.271134 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:14.271114 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:14.271200 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:14.271184 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret podName:90b5377e-36eb-4780-83e8-96ffc2917ab6 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:22.271168187 +0000 UTC m=+30.060785172 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret") pod "global-pull-secret-syncer-qhtpj" (UID: "90b5377e-36eb-4780-83e8-96ffc2917ab6") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:14.405392 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.405367 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:58:14.685830 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.685597 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:58:14.405382178Z","UUID":"5c89cc4a-790a-4c58-8c2b-60777296eb43","Handler":null,"Name":"","Endpoint":""} Apr 16 20:58:14.688509 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.688484 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:58:14.688509 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.688515 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:58:14.714564 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.714535 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:14.714698 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:14.714666 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:14.714698 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.714688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:14.714827 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:14.714794 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
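The plugin_watcher / RegisterPlugin / csi_plugin lines above are the kubelet's plugin-registration handshake with the EBS CSI driver: a registration socket appears under /var/lib/kubelet/plugins_registry/, the kubelet dials it, asks GetInfo, validates the advertised driver name, endpoint, and versions (ebs.csi.aws.com, 1.0.0), then reports the outcome back. An illustrative sketch of the plugin side against the pluginregistration/v1 API; in this cluster the real implementation is the csi-node-driver-registrar sidecar, and the details here are assumptions:

```go
// Sketch of the plugin side of the registration handshake logged above.
// Assumes the pluginregistration/v1 gRPC API; not the real registrar.
package main

import (
	"context"
	"log"
	"net"

	"google.golang.org/grpc"
	registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
)

type registrar struct{}

// GetInfo is what the kubelet calls after plugin_watcher spots the socket.
func (registrar) GetInfo(_ context.Context, _ *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
	return &registerapi.PluginInfo{
		Type:              registerapi.CSIPlugin,
		Name:              "ebs.csi.aws.com",
		Endpoint:          "/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock",
		SupportedVersions: []string{"1.0.0"}, // the version the log shows being validated
	}, nil
}

// NotifyRegistrationStatus receives the result of the kubelet's validation.
func (registrar) NotifyRegistrationStatus(_ context.Context, st *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
	log.Printf("registered=%v err=%q", st.PluginRegistered, st.Error)
	return &registerapi.RegistrationStatusResponse{}, nil
}

func main() {
	lis, err := net.Listen("unix", "/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock")
	if err != nil {
		log.Fatal(err)
	}
	srv := grpc.NewServer()
	registerapi.RegisterRegistrationServer(srv, registrar{})
	log.Fatal(srv.Serve(lis))
}
```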
pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:14.832569 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.832530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hhb77" event={"ID":"c364ce0e-4d4b-44b6-957a-88b603b24167","Type":"ContainerStarted","Data":"6156d8bfc38ec28447bf954e748c871536ba695b33b27c12b9627dd72b4d2d57"} Apr 16 20:58:14.834169 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:14.834139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" event={"ID":"a9785ef4-4aae-480f-9acc-3797a9cc8b98","Type":"ContainerStarted","Data":"06f551bbeee58637b246f1b60a9339fd50629c6c55e70eb3f658a04d50743110"} Apr 16 20:58:15.714496 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:15.714471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:15.714624 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:15.714599 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:15.837679 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:15.837636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" event={"ID":"a9785ef4-4aae-480f-9acc-3797a9cc8b98","Type":"ContainerStarted","Data":"5b90459d42540da9838e6e06c244eac7f2003606ec3dd6461e8e6c9bc29b8163"} Apr 16 20:58:15.843335 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:15.843315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log" Apr 16 20:58:15.843687 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:15.843661 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"5899efce26d25c7376a5d5df164fe0cbb16018a8b91ca556e20f8171d0af76d3"} Apr 16 20:58:15.857691 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:15.857642 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hhb77" podStartSLOduration=6.784250394 podStartE2EDuration="23.857626437s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.488616342 +0000 UTC m=+3.278233326" lastFinishedPulling="2026-04-16 20:58:12.561992367 +0000 UTC m=+20.351609369" observedRunningTime="2026-04-16 20:58:14.858257032 +0000 UTC m=+22.647874037" watchObservedRunningTime="2026-04-16 20:58:15.857626437 +0000 UTC m=+23.647243445" Apr 16 20:58:15.857857 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:15.857831 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d6b6x" podStartSLOduration=2.7037606690000002 podStartE2EDuration="22.857823801s" podCreationTimestamp="2026-04-16 20:57:53 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.518563563 +0000 UTC m=+3.308180552" 
lastFinishedPulling="2026-04-16 20:58:15.672626699 +0000 UTC m=+23.462243684" observedRunningTime="2026-04-16 20:58:15.856888737 +0000 UTC m=+23.646505753" watchObservedRunningTime="2026-04-16 20:58:15.857823801 +0000 UTC m=+23.647440831" Apr 16 20:58:16.714834 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:16.714801 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:16.715003 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:16.714976 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:16.715090 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:16.715051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:16.715189 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:16.715163 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:17.714765 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:17.714740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:17.715288 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:17.714834 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:18.247564 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.247524 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:58:18.248224 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.248198 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:58:18.714527 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.714498 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:18.714527 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.714522 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:18.714673 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:18.714624 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:18.714762 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:18.714735 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:18.852783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.852572 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log" Apr 16 20:58:18.853522 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.853086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"907bd0598a455a01445483d80b20c2cea14a4faea2f472eaa7a88bea38fe1782"} Apr 16 20:58:18.853854 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.853542 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:58:18.853854 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.853653 2575 scope.go:117] "RemoveContainer" containerID="56d113a53036bdd9d48672889790c512e48f786ba7bf26b538b44ca396035cf3" Apr 16 20:58:18.854741 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:18.854203 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q97vd" Apr 16 20:58:19.714931 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.714907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:19.715052 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:19.715000 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:19.856505 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.856473 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6128194-41ca-4dbf-a538-c346ec94bd50" containerID="fcc9228603779e3920ec73a392a257f4f4c73600844c72f1e0b16cc6e271408a" exitCode=0 Apr 16 20:58:19.856963 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.856556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerDied","Data":"fcc9228603779e3920ec73a392a257f4f4c73600844c72f1e0b16cc6e271408a"} Apr 16 20:58:19.859688 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.859646 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log" Apr 16 20:58:19.859950 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.859922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" event={"ID":"ffb608ad-9008-418c-a258-d80cf876c140","Type":"ContainerStarted","Data":"8de7365a5bab5f70bc7935c1cbda1e2e3bead026a758fa999cbc81d36ad1445c"} Apr 16 20:58:19.860084 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.860068 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 20:58:19.860316 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.860292 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:58:19.860430 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.860325 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:58:19.874493 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.874467 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:58:19.874618 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.874569 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:58:19.928294 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:19.928254 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" podStartSLOduration=10.802268015 podStartE2EDuration="27.928240776s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.515298164 +0000 UTC m=+3.304915147" lastFinishedPulling="2026-04-16 20:58:12.641270911 +0000 UTC m=+20.430887908" observedRunningTime="2026-04-16 20:58:19.928067374 +0000 UTC m=+27.717684378" watchObservedRunningTime="2026-04-16 20:58:19.928240776 +0000 UTC m=+27.717857780" Apr 16 20:58:20.707808 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.707776 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nngg9"] Apr 16 20:58:20.707979 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.707899 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:20.708028 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:20.707984 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:20.710055 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.710031 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zbj49"] Apr 16 20:58:20.710164 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.710134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:20.710226 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:20.710210 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:20.714627 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.714581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:20.714706 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:20.714668 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:20.721467 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.721426 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhtpj"] Apr 16 20:58:20.863405 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.863371 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6128194-41ca-4dbf-a538-c346ec94bd50" containerID="b66b0aa6e98cd927c4fb9aea4d32518ae36f1f9ce43025e1d0c3f2d8392e2db4" exitCode=0 Apr 16 20:58:20.863787 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.863471 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:20.863787 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.863486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerDied","Data":"b66b0aa6e98cd927c4fb9aea4d32518ae36f1f9ce43025e1d0c3f2d8392e2db4"} Apr 16 20:58:20.863787 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:20.863654 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 20:58:20.864491 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:20.864076 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:21.304408 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:21.304376 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:58:21.867959 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:21.867733 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6128194-41ca-4dbf-a538-c346ec94bd50" containerID="e4b2e5de3b55ccbdb35bcc969bfea66b5b891f4e23a2b85d6a5a430a6af64940" exitCode=0 Apr 16 20:58:21.868244 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:21.867770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerDied","Data":"e4b2e5de3b55ccbdb35bcc969bfea66b5b891f4e23a2b85d6a5a430a6af64940"} Apr 16 20:58:22.331535 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:22.331473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:22.331772 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:22.331654 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:22.331772 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:22.331733 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret podName:90b5377e-36eb-4780-83e8-96ffc2917ab6 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:38.331712987 +0000 UTC m=+46.121329973 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret") pod "global-pull-secret-syncer-qhtpj" (UID: "90b5377e-36eb-4780-83e8-96ffc2917ab6") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:58:22.715393 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:22.715360 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:22.715598 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:22.715465 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:22.715598 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:22.715485 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 20:58:22.715598 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:22.715510 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:22.715763 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:22.715608 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qhtpj" podUID="90b5377e-36eb-4780-83e8-96ffc2917ab6" Apr 16 20:58:22.715763 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:22.715716 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nngg9" podUID="a8b58c12-2972-4947-9fea-b3f94f82f207" Apr 16 20:58:24.575545 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.575501 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-171.ec2.internal" event="NodeReady" Apr 16 20:58:24.576136 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.575669 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:58:24.630662 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.630634 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd"] Apr 16 20:58:24.665187 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.665157 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74c85f5b8b-5nkg7"] Apr 16 20:58:24.665347 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.665253 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:24.668653 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.668552 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ggv4b\"" Apr 16 20:58:24.668653 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.668553 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 20:58:24.668653 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.668613 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 20:58:24.679240 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.679191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd"] Apr 16 20:58:24.679240 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.679218 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-72rld"] Apr 16 20:58:24.679398 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.679339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.684989 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.684966 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 20:58:24.685646 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.685625 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 20:58:24.685765 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.685708 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pqqcq\"" Apr 16 20:58:24.685860 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.685838 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 20:58:24.692088 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.692068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 20:58:24.697156 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.697134 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74c85f5b8b-5nkg7"] Apr 16 20:58:24.697258 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.697161 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-72rld"] Apr 16 20:58:24.697318 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.697280 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:24.700629 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.700608 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:58:24.700629 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.700620 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:58:24.700783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.700688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:58:24.700783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.700619 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mhzpf\"" Apr 16 20:58:24.714603 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.714582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:24.714697 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.714605 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:24.714697 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.714676 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:24.718924 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.718905 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:58:24.719017 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.718910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c7977\"" Apr 16 20:58:24.719174 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.719157 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:58:24.719305 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.719288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:58:24.719392 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.719352 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:58:24.719392 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.719378 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lvzr6\"" Apr 16 20:58:24.748313 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:24.748313 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-trusted-ca\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748504 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748504 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-installation-pull-secrets\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748585 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-bound-sa-token\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748585 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7852b3f2-db76-4624-a66c-450474aeaa93-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:24.748694 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-image-registry-private-configuration\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748747 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-certificates\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748747 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.748730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvbw\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-kube-api-access-4gvbw\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.748829 ip-10-0-141-171 
kubenswrapper[2575]: I0416 20:58:24.748781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81c67e7e-270f-4265-a65a-8caa7e5da99f-ca-trust-extracted\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.760112 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.760089 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-b7bcv"] Apr 16 20:58:24.775683 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.775516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.776197 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.776148 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b7bcv"] Apr 16 20:58:24.778803 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.778778 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:58:24.778908 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.778866 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:58:24.778974 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.778925 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfmp4\"" Apr 16 20:58:24.849207 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-certificates\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849404 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvbw\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-kube-api-access-4gvbw\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849404 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6767bcb0-f122-44f9-b378-3f18e741a065-tmp-dir\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.849548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81c67e7e-270f-4265-a65a-8caa7e5da99f-ca-trust-extracted\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849548 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849535 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wskr\" (UniqueName: 
\"kubernetes.io/projected/6767bcb0-f122-44f9-b378-3f18e741a065-kube-api-access-2wskr\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.849656 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:24.849656 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-trusted-ca\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849656 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.849806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g8c8\" (UniqueName: \"kubernetes.io/projected/97e00ea1-e79f-4a6f-b820-0cafb65f4308-kube-api-access-2g8c8\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:24.849806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-installation-pull-secrets\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-bound-sa-token\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.849806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7852b3f2-db76-4624-a66c-450474aeaa93-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-certificates\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-image-registry-private-configuration\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.849935 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6767bcb0-f122-44f9-b378-3f18e741a065-config-volume\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.849970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81c67e7e-270f-4265-a65a-8caa7e5da99f-ca-trust-extracted\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.850029 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.850029 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:25.349976384 +0000 UTC m=+33.139593373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:24.850358 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.850341 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:24.850358 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.850355 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:24.850470 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.850410 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:25.350391366 +0000 UTC m=+33.140008349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:24.850718 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.850690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7852b3f2-db76-4624-a66c-450474aeaa93-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:24.850955 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.850935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-trusted-ca\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.854158 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.854137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-installation-pull-secrets\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.854313 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.854208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-image-registry-private-configuration\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.859860 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.859702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvbw\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-kube-api-access-4gvbw\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: 
\"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.862068 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.862024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-bound-sa-token\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:24.950565 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6767bcb0-f122-44f9-b378-3f18e741a065-tmp-dir\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.950565 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wskr\" (UniqueName: \"kubernetes.io/projected/6767bcb0-f122-44f9-b378-3f18e741a065-kube-api-access-2wskr\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.950751 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.950751 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2g8c8\" (UniqueName: \"kubernetes.io/projected/97e00ea1-e79f-4a6f-b820-0cafb65f4308-kube-api-access-2g8c8\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:24.950857 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:24.950857 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.950763 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:24.950857 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6767bcb0-f122-44f9-b378-3f18e741a065-config-volume\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.951303 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.950856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6767bcb0-f122-44f9-b378-3f18e741a065-tmp-dir\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.951303 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.951299 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:25.450808409 +0000 UTC m=+33.240425398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:24.951560 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.951543 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:24.951646 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.951574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6767bcb0-f122-44f9-b378-3f18e741a065-config-volume\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.951646 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:24.951617 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:25.451599191 +0000 UTC m=+33.241216186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:24.968849 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.968827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wskr\" (UniqueName: \"kubernetes.io/projected/6767bcb0-f122-44f9-b378-3f18e741a065-kube-api-access-2wskr\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:24.973799 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:24.973760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g8c8\" (UniqueName: \"kubernetes.io/projected/97e00ea1-e79f-4a6f-b820-0cafb65f4308-kube-api-access-2g8c8\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:25.355257 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:25.355174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:25.355257 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:25.355224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:25.355544 ip-10-0-141-171 kubenswrapper[2575]: 
E0416 20:58:25.355354 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:25.355544 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.355370 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:25.355544 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.355391 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:25.355544 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.355466 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:26.35542866 +0000 UTC m=+34.145045660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:25.355544 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.355484 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:26.355475061 +0000 UTC m=+34.145092046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:25.456604 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:25.456402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:25.456604 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:25.456548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:25.456885 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.456664 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:25.456885 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.456741 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:26.45672682 +0000 UTC m=+34.246343803 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:25.456885 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.456664 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:25.456885 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:25.456850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:26.456829044 +0000 UTC m=+34.246446046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:26.363647 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.363607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:26.364497 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.363656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:26.364497 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.363761 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:26.364497 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.363775 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:26.364497 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.363772 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:26.364497 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.363836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:28.363821992 +0000 UTC m=+36.153438985 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:26.364497 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.363851 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:28.363843086 +0000 UTC m=+36.153460068 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:26.464806 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.464767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:26.464985 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.464831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:26.464985 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.464863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:26.464985 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.464918 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:26.464985 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.464973 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:26.465174 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.464994 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:58:26.465174 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.464999 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:28.464979492 +0000 UTC m=+36.254596493 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:26.465174 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.465065 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:28.465048383 +0000 UTC m=+36.254665384 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:26.465174 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:26.465076 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:58.465070369 +0000 UTC m=+66.254687351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : secret "metrics-daemon-secret" not found Apr 16 20:58:26.565874 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.565836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:26.568956 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.568928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsxh\" (UniqueName: \"kubernetes.io/projected/a8b58c12-2972-4947-9fea-b3f94f82f207-kube-api-access-dgsxh\") pod \"network-check-target-nngg9\" (UID: \"a8b58c12-2972-4947-9fea-b3f94f82f207\") " pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:26.825175 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:26.825137 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:27.652253 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:27.652093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nngg9"] Apr 16 20:58:27.733084 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:58:27.733055 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b58c12_2972_4947_9fea_b3f94f82f207.slice/crio-330972d494e094706beb67f4666c24775f8685d4a324849c0bc1391573338039 WatchSource:0}: Error finding container 330972d494e094706beb67f4666c24775f8685d4a324849c0bc1391573338039: Status 404 returned error can't find the container with id 330972d494e094706beb67f4666c24775f8685d4a324849c0bc1391573338039 Apr 16 20:58:27.880210 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:27.880186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nngg9" event={"ID":"a8b58c12-2972-4947-9fea-b3f94f82f207","Type":"ContainerStarted","Data":"330972d494e094706beb67f4666c24775f8685d4a324849c0bc1391573338039"} Apr 16 20:58:28.382290 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:28.382214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:28.382290 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:28.382251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:28.382523 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.382410 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:28.382523 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.382426 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:28.382523 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.382409 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:28.382523 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.382519 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:32.382497058 +0000 UTC m=+40.172114043 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:28.382699 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.382538 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:32.382528821 +0000 UTC m=+40.172145808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:28.483401 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:28.483368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:28.483570 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:28.483454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:28.483570 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.483504 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:28.483570 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.483557 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:28.483692 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.483589 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:32.483562091 +0000 UTC m=+40.273179077 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:28.483692 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:28.483610 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:32.483595813 +0000 UTC m=+40.273212797 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:28.885583 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:28.885546 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6128194-41ca-4dbf-a538-c346ec94bd50" containerID="a94c7849903967ea27756afdf64f21b8b69097f6f72af5218a6c1bcd5afba827" exitCode=0 Apr 16 20:58:28.886058 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:28.885601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerDied","Data":"a94c7849903967ea27756afdf64f21b8b69097f6f72af5218a6c1bcd5afba827"} Apr 16 20:58:29.891120 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:29.891082 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6128194-41ca-4dbf-a538-c346ec94bd50" containerID="8007accc25177d4c41b9a07dabebbc7abb1f0cd3657fd739907a01f25dd12c8d" exitCode=0 Apr 16 20:58:29.891632 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:29.891150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerDied","Data":"8007accc25177d4c41b9a07dabebbc7abb1f0cd3657fd739907a01f25dd12c8d"} Apr 16 20:58:30.896121 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:30.896047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" event={"ID":"c6128194-41ca-4dbf-a538-c346ec94bd50","Type":"ContainerStarted","Data":"fc9c0c6d2d88f9580fb5c379ef538e3f2d9e515f74f2961ea38206e010dbda86"} Apr 16 20:58:30.922933 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:30.922877 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8jcqc" podStartSLOduration=6.652688826 podStartE2EDuration="38.922862706s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:57:55.491974544 +0000 UTC m=+3.281591538" lastFinishedPulling="2026-04-16 20:58:27.762148431 +0000 UTC m=+35.551765418" observedRunningTime="2026-04-16 20:58:30.92089462 +0000 UTC m=+38.710511645" watchObservedRunningTime="2026-04-16 20:58:30.922862706 +0000 UTC m=+38.712479711" Apr 16 20:58:31.899280 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:31.899245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nngg9" event={"ID":"a8b58c12-2972-4947-9fea-b3f94f82f207","Type":"ContainerStarted","Data":"57292795d3d336cc484cbb6ece413b00e9276f08ac9006eab4a91dc7306bbcd7"} Apr 16 20:58:31.899780 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:31.899672 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:58:32.416473 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:32.416425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 
20:58:32.416473 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:32.416479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:32.416668 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.416561 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:32.416668 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.416582 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:32.416668 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.416593 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:32.416668 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.416624 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:40.416609124 +0000 UTC m=+48.206226107 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:32.416668 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.416637 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:40.416631513 +0000 UTC m=+48.206248495 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:32.517538 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:32.517504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:32.517680 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:32.517565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:32.517680 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.517639 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:32.517680 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.517651 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:32.517803 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.517703 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:40.517683482 +0000 UTC m=+48.307300465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:32.517803 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:32.517727 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:40.517717057 +0000 UTC m=+48.307334041 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:34.928109 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.928061 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nngg9" podStartSLOduration=39.67872098 podStartE2EDuration="42.928046573s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 20:58:27.740371499 +0000 UTC m=+35.529988482" lastFinishedPulling="2026-04-16 20:58:30.989697089 +0000 UTC m=+38.779314075" observedRunningTime="2026-04-16 20:58:31.915668192 +0000 UTC m=+39.705285196" watchObservedRunningTime="2026-04-16 20:58:34.928046573 +0000 UTC m=+42.717663555" Apr 16 20:58:34.928487 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.928265 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp"] Apr 16 20:58:34.933646 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.933626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d"] Apr 16 20:58:34.933795 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.933776 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:34.937184 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.937162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:34.937794 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.937773 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 20:58:34.937892 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.937791 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 20:58:34.937892 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.937780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 20:58:34.937892 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.937850 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 20:58:34.938029 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.937896 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-xvn55\"" Apr 16 20:58:34.939617 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.939597 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 20:58:34.939964 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.939941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp"] Apr 16 20:58:34.943048 ip-10-0-141-171 
kubenswrapper[2575]: I0416 20:58:34.943029 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d"] Apr 16 20:58:34.945670 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.945650 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk"] Apr 16 20:58:34.949297 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.949281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:34.951817 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.951797 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 20:58:34.951905 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.951834 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 20:58:34.952003 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.951840 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 20:58:34.952003 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.951956 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 20:58:34.959599 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:34.959567 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk"] Apr 16 20:58:35.035383 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.035336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchjn\" (UniqueName: \"kubernetes.io/projected/4910ea1a-2750-4912-8fcb-8242e5118e32-kube-api-access-wchjn\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.035531 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.035397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drfq5\" (UniqueName: \"kubernetes.io/projected/513ff303-5311-4cf5-8916-d5c41fdf0c3b-kube-api-access-drfq5\") pod \"managed-serviceaccount-addon-agent-c6499d69c-tj2hp\" (UID: \"513ff303-5311-4cf5-8916-d5c41fdf0c3b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.035531 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.035469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4910ea1a-2750-4912-8fcb-8242e5118e32-tmp\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.035531 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.035508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/4910ea1a-2750-4912-8fcb-8242e5118e32-klusterlet-config\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.035686 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.035555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/513ff303-5311-4cf5-8916-d5c41fdf0c3b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c6499d69c-tj2hp\" (UID: \"513ff303-5311-4cf5-8916-d5c41fdf0c3b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.136783 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.136752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/69266b49-8b1f-4643-accb-e8a20922a94e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.136930 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.136796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.136930 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.136818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-hub\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.136930 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.136897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wchjn\" (UniqueName: \"kubernetes.io/projected/4910ea1a-2750-4912-8fcb-8242e5118e32-kube-api-access-wchjn\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.137027 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.136985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4910ea1a-2750-4912-8fcb-8242e5118e32-tmp\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.137027 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4910ea1a-2750-4912-8fcb-8242e5118e32-klusterlet-config\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.137117 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-ca\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.137117 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/513ff303-5311-4cf5-8916-d5c41fdf0c3b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c6499d69c-tj2hp\" (UID: \"513ff303-5311-4cf5-8916-d5c41fdf0c3b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.137189 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drfq5\" (UniqueName: \"kubernetes.io/projected/513ff303-5311-4cf5-8916-d5c41fdf0c3b-kube-api-access-drfq5\") pod \"managed-serviceaccount-addon-agent-c6499d69c-tj2hp\" (UID: \"513ff303-5311-4cf5-8916-d5c41fdf0c3b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.137189 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t974r\" (UniqueName: \"kubernetes.io/projected/69266b49-8b1f-4643-accb-e8a20922a94e-kube-api-access-t974r\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.137282 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.137432 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.137413 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4910ea1a-2750-4912-8fcb-8242e5118e32-tmp\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.140744 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.140724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4910ea1a-2750-4912-8fcb-8242e5118e32-klusterlet-config\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.140823 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.140743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/513ff303-5311-4cf5-8916-d5c41fdf0c3b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c6499d69c-tj2hp\" (UID: \"513ff303-5311-4cf5-8916-d5c41fdf0c3b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.145257 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.145237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchjn\" (UniqueName: \"kubernetes.io/projected/4910ea1a-2750-4912-8fcb-8242e5118e32-kube-api-access-wchjn\") pod \"klusterlet-addon-workmgr-6558dfbfd8-85z9d\" (UID: \"4910ea1a-2750-4912-8fcb-8242e5118e32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.145420 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.145400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drfq5\" (UniqueName: \"kubernetes.io/projected/513ff303-5311-4cf5-8916-d5c41fdf0c3b-kube-api-access-drfq5\") pod \"managed-serviceaccount-addon-agent-c6499d69c-tj2hp\" (UID: \"513ff303-5311-4cf5-8916-d5c41fdf0c3b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.238139 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.238139 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-hub\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.238272 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-ca\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.238272 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t974r\" (UniqueName: \"kubernetes.io/projected/69266b49-8b1f-4643-accb-e8a20922a94e-kube-api-access-t974r\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.238272 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.238272 
ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/69266b49-8b1f-4643-accb-e8a20922a94e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.239018 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.238992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/69266b49-8b1f-4643-accb-e8a20922a94e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.240344 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.240315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.240497 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.240479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-hub\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.240549 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.240507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-ca\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.240666 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.240649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69266b49-8b1f-4643-accb-e8a20922a94e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.246372 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.246346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t974r\" (UniqueName: \"kubernetes.io/projected/69266b49-8b1f-4643-accb-e8a20922a94e-kube-api-access-t974r\") pod \"cluster-proxy-proxy-agent-6769466bc4-ngbtk\" (UID: \"69266b49-8b1f-4643-accb-e8a20922a94e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.251258 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.251242 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" Apr 16 20:58:35.258836 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.258812 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:35.295114 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.295041 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 20:58:35.401706 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.401683 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp"] Apr 16 20:58:35.404572 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:58:35.404546 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod513ff303_5311_4cf5_8916_d5c41fdf0c3b.slice/crio-ae8edf3cb142da5e027a200a7c2c7a156551ffd402b75862b1476e4037028b8a WatchSource:0}: Error finding container ae8edf3cb142da5e027a200a7c2c7a156551ffd402b75862b1476e4037028b8a: Status 404 returned error can't find the container with id ae8edf3cb142da5e027a200a7c2c7a156551ffd402b75862b1476e4037028b8a Apr 16 20:58:35.418931 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.418781 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d"] Apr 16 20:58:35.421467 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:58:35.421422 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4910ea1a_2750_4912_8fcb_8242e5118e32.slice/crio-e832e1cdbcc0ee8bf0ce5eccd713d7902c887ab2c83b60ffb8f3249657989a7d WatchSource:0}: Error finding container e832e1cdbcc0ee8bf0ce5eccd713d7902c887ab2c83b60ffb8f3249657989a7d: Status 404 returned error can't find the container with id e832e1cdbcc0ee8bf0ce5eccd713d7902c887ab2c83b60ffb8f3249657989a7d Apr 16 20:58:35.446418 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.446397 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk"] Apr 16 20:58:35.448700 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:58:35.448674 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69266b49_8b1f_4643_accb_e8a20922a94e.slice/crio-a3648c27680b29694e65a191d247d23e225ca03d5dc6d51267d4d751ce99cfd6 WatchSource:0}: Error finding container a3648c27680b29694e65a191d247d23e225ca03d5dc6d51267d4d751ce99cfd6: Status 404 returned error can't find the container with id a3648c27680b29694e65a191d247d23e225ca03d5dc6d51267d4d751ce99cfd6 Apr 16 20:58:35.907108 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.907069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" event={"ID":"69266b49-8b1f-4643-accb-e8a20922a94e","Type":"ContainerStarted","Data":"a3648c27680b29694e65a191d247d23e225ca03d5dc6d51267d4d751ce99cfd6"} Apr 16 20:58:35.907964 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.907942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" event={"ID":"4910ea1a-2750-4912-8fcb-8242e5118e32","Type":"ContainerStarted","Data":"e832e1cdbcc0ee8bf0ce5eccd713d7902c887ab2c83b60ffb8f3249657989a7d"} Apr 16 20:58:35.908759 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:35.908739 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" event={"ID":"513ff303-5311-4cf5-8916-d5c41fdf0c3b","Type":"ContainerStarted","Data":"ae8edf3cb142da5e027a200a7c2c7a156551ffd402b75862b1476e4037028b8a"} Apr 16 20:58:38.368609 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:38.368564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:38.371302 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:38.371267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b5377e-36eb-4780-83e8-96ffc2917ab6-original-pull-secret\") pod \"global-pull-secret-syncer-qhtpj\" (UID: \"90b5377e-36eb-4780-83e8-96ffc2917ab6\") " pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:38.537526 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:38.537476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhtpj" Apr 16 20:58:39.805413 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:39.805389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhtpj"] Apr 16 20:58:39.807876 ip-10-0-141-171 kubenswrapper[2575]: W0416 20:58:39.807851 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b5377e_36eb_4780_83e8_96ffc2917ab6.slice/crio-ab2d4a4547b3b20836a4db535254a51ecec26628dbae628f3599e4aa1c98caa3 WatchSource:0}: Error finding container ab2d4a4547b3b20836a4db535254a51ecec26628dbae628f3599e4aa1c98caa3: Status 404 returned error can't find the container with id ab2d4a4547b3b20836a4db535254a51ecec26628dbae628f3599e4aa1c98caa3 Apr 16 20:58:39.921758 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:39.921680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhtpj" event={"ID":"90b5377e-36eb-4780-83e8-96ffc2917ab6","Type":"ContainerStarted","Data":"ab2d4a4547b3b20836a4db535254a51ecec26628dbae628f3599e4aa1c98caa3"} Apr 16 20:58:39.922796 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:39.922770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" event={"ID":"69266b49-8b1f-4643-accb-e8a20922a94e","Type":"ContainerStarted","Data":"d752d9db7f7fd6459d1831848bca1262cef839f47075837b15ed4cb7169ff059"} Apr 16 20:58:39.924002 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:39.923973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" event={"ID":"513ff303-5311-4cf5-8916-d5c41fdf0c3b","Type":"ContainerStarted","Data":"f59c3d82a6211d03f1ac692faacfb6819a12b5413a7d6a8786e7dcaecca1e5fe"} Apr 16 20:58:39.939816 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:39.939756 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" podStartSLOduration=1.658884287 podStartE2EDuration="5.939744079s" podCreationTimestamp="2026-04-16 20:58:34 +0000 UTC" 
firstStartedPulling="2026-04-16 20:58:35.407145573 +0000 UTC m=+43.196762568" lastFinishedPulling="2026-04-16 20:58:39.688005365 +0000 UTC m=+47.477622360" observedRunningTime="2026-04-16 20:58:39.938852763 +0000 UTC m=+47.728469778" watchObservedRunningTime="2026-04-16 20:58:39.939744079 +0000 UTC m=+47.729361109" Apr 16 20:58:40.485490 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:40.485424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:40.485659 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:40.485513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:40.485659 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.485581 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:40.485790 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.485667 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:56.485644437 +0000 UTC m=+64.275261421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:40.485790 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.485700 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:40.485790 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.485720 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:40.485790 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.485773 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 20:58:56.485755729 +0000 UTC m=+64.275372727 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:40.586413 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:40.586365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:40.586608 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:40.586460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:40.586608 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.586521 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:40.586608 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.586590 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:56.586569806 +0000 UTC m=+64.376186789 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:40.586608 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.586595 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:40.586827 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:40.586645 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:58:56.586628397 +0000 UTC m=+64.376245379 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:42.932016 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:42.931963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" event={"ID":"4910ea1a-2750-4912-8fcb-8242e5118e32","Type":"ContainerStarted","Data":"7228ae16db09b7ce4f7b10fac566f0b496a459b860f695c198f3e6296bf84546"} Apr 16 20:58:42.932598 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:42.932179 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:42.934249 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:42.934224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 20:58:42.949565 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:42.949522 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" podStartSLOduration=2.233916383 podStartE2EDuration="8.949507221s" podCreationTimestamp="2026-04-16 20:58:34 +0000 UTC" firstStartedPulling="2026-04-16 20:58:35.423016728 +0000 UTC m=+43.212633714" lastFinishedPulling="2026-04-16 20:58:42.138607557 +0000 UTC m=+49.928224552" observedRunningTime="2026-04-16 20:58:42.948218116 +0000 UTC m=+50.737835134" watchObservedRunningTime="2026-04-16 20:58:42.949507221 +0000 UTC m=+50.739124225" Apr 16 20:58:45.939862 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:45.939823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhtpj" event={"ID":"90b5377e-36eb-4780-83e8-96ffc2917ab6","Type":"ContainerStarted","Data":"b5e2c2bebc8efa5cd9d280fd26b94870f4d7c97e2aa6924675af7a0e6bb0ed38"} Apr 16 20:58:45.941714 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:45.941690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" event={"ID":"69266b49-8b1f-4643-accb-e8a20922a94e","Type":"ContainerStarted","Data":"061a350b0a0adf042a961829e42c80d5dd39115ef2e4d8aadff1c766ce941826"} Apr 16 20:58:45.941809 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:45.941719 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" event={"ID":"69266b49-8b1f-4643-accb-e8a20922a94e","Type":"ContainerStarted","Data":"0f14fedf4a83a2d59be43e5f0be7c6978aad19088bdb3e23e87fbefc61cf0cdd"} Apr 16 20:58:45.954921 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:45.954877 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qhtpj" podStartSLOduration=34.204198942 podStartE2EDuration="39.954866548s" podCreationTimestamp="2026-04-16 20:58:06 +0000 UTC" firstStartedPulling="2026-04-16 20:58:39.810249863 +0000 UTC m=+47.599866853" lastFinishedPulling="2026-04-16 20:58:45.560917462 +0000 UTC m=+53.350534459" observedRunningTime="2026-04-16 20:58:45.954186857 +0000 UTC m=+53.743803861" watchObservedRunningTime="2026-04-16 20:58:45.954866548 +0000 UTC m=+53.744483553" Apr 16 20:58:45.972631 ip-10-0-141-171 
kubenswrapper[2575]: I0416 20:58:45.972594 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" podStartSLOduration=1.857496001 podStartE2EDuration="11.972583181s" podCreationTimestamp="2026-04-16 20:58:34 +0000 UTC" firstStartedPulling="2026-04-16 20:58:35.450256093 +0000 UTC m=+43.239873080" lastFinishedPulling="2026-04-16 20:58:45.565343268 +0000 UTC m=+53.354960260" observedRunningTime="2026-04-16 20:58:45.971942388 +0000 UTC m=+53.761559396" watchObservedRunningTime="2026-04-16 20:58:45.972583181 +0000 UTC m=+53.762200186" Apr 16 20:58:52.880130 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:52.880103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tzbn8" Apr 16 20:58:56.503401 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:56.503362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:58:56.503401 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:56.503403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:58:56.503817 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.503519 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:58:56.503817 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.503595 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:28.503579173 +0000 UTC m=+96.293196161 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:58:56.503817 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.503523 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:58:56.503817 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.503624 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:58:56.503817 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.503674 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. 
No retries permitted until 2026-04-16 20:59:28.503661951 +0000 UTC m=+96.293278934 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:58:56.603829 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:56.603801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:58:56.603929 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:56.603845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:58:56.603962 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.603933 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:58:56.603962 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.603939 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:58:56.604023 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.603986 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:28.603970611 +0000 UTC m=+96.393587594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:58:56.604023 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:56.604000 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 20:59:28.603993954 +0000 UTC m=+96.393610937 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 20:58:58.517562 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:58:58.517524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 20:58:58.518088 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:58.517697 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:58:58.518088 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:58:58.517777 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:02.517756159 +0000 UTC m=+130.307373143 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : secret "metrics-daemon-secret" not found Apr 16 20:59:03.905711 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:59:03.905681 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nngg9" Apr 16 20:59:28.543076 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:59:28.543030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 20:59:28.543076 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:59:28.543076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 20:59:28.543628 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.543179 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:59:28.543628 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.543191 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 20:59:28.543628 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.543235 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 21:00:32.54322255 +0000 UTC m=+160.332839533 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 20:59:28.543628 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.543178 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:59:28.543628 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.543321 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:32.543306067 +0000 UTC m=+160.332923051 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 20:59:28.644340 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:59:28.644302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 20:59:28.644519 ip-10-0-141-171 kubenswrapper[2575]: I0416 20:59:28.644356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 20:59:28.644519 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.644473 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:59:28.644519 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.644475 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:59:28.644619 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.644526 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:32.644510515 +0000 UTC m=+160.434127498 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 20:59:28.644619 ip-10-0-141-171 kubenswrapper[2575]: E0416 20:59:28.644540 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 21:00:32.644532384 +0000 UTC m=+160.434149366 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 21:00:02.601968 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:02.601913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 21:00:02.602499 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:02.602051 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 21:00:02.602499 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:02.602120 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs podName:85e4090a-8dbc-4412-b20a-23d79d838363 nodeName:}" failed. No retries permitted until 2026-04-16 21:02:04.602104573 +0000 UTC m=+252.391721555 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs") pod "network-metrics-daemon-zbj49" (UID: "85e4090a-8dbc-4412-b20a-23d79d838363") : secret "metrics-daemon-secret" not found Apr 16 21:00:24.880412 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:24.880380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tzlx6_92c3499b-a30d-4470-8b98-4e3a3b91de06/dns-node-resolver/0.log" Apr 16 21:00:25.681010 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:25.680978 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7hjd_63a48313-4beb-4c3e-89bb-bddcb5b76d5a/node-ca/0.log" Apr 16 21:00:27.677172 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:27.677120 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" podUID="7852b3f2-db76-4624-a66c-450474aeaa93" Apr 16 21:00:27.688412 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:27.688383 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" podUID="81c67e7e-270f-4265-a65a-8caa7e5da99f" Apr 16 21:00:27.706590 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:27.706566 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-72rld" podUID="97e00ea1-e79f-4a6f-b820-0cafb65f4308" Apr 16 21:00:27.732610 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:27.732587 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zbj49" podUID="85e4090a-8dbc-4412-b20a-23d79d838363" Apr 16 21:00:27.786643 ip-10-0-141-171 kubenswrapper[2575]: E0416 
21:00:27.786620 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-b7bcv" podUID="6767bcb0-f122-44f9-b378-3f18e741a065" Apr 16 21:00:28.181387 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:28.181359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 21:00:28.181387 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:28.181376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b7bcv" Apr 16 21:00:28.181629 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:28.181359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 21:00:32.619983 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:32.619945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 21:00:32.619983 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:32.619985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") pod \"image-registry-74c85f5b8b-5nkg7\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 21:00:32.620477 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.620079 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 21:00:32.620477 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.620092 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74c85f5b8b-5nkg7: secret "image-registry-tls" not found Apr 16 21:00:32.620477 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.620142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls podName:81c67e7e-270f-4265-a65a-8caa7e5da99f nodeName:}" failed. No retries permitted until 2026-04-16 21:02:34.62012785 +0000 UTC m=+282.409744832 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls") pod "image-registry-74c85f5b8b-5nkg7" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f") : secret "image-registry-tls" not found Apr 16 21:00:32.620477 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.620080 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 21:00:32.620477 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.620200 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert podName:7852b3f2-db76-4624-a66c-450474aeaa93 nodeName:}" failed. 
No retries permitted until 2026-04-16 21:02:34.620186856 +0000 UTC m=+282.409803838 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ck6cd" (UID: "7852b3f2-db76-4624-a66c-450474aeaa93") : secret "networking-console-plugin-cert" not found Apr 16 21:00:32.720882 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:32.720835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 21:00:32.721043 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.720966 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 21:00:32.721043 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:32.720989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 21:00:32.721043 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.721024 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls podName:6767bcb0-f122-44f9-b378-3f18e741a065 nodeName:}" failed. No retries permitted until 2026-04-16 21:02:34.721010126 +0000 UTC m=+282.510627108 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls") pod "dns-default-b7bcv" (UID: "6767bcb0-f122-44f9-b378-3f18e741a065") : secret "dns-default-metrics-tls" not found Apr 16 21:00:32.721198 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.721067 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 21:00:32.721198 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:00:32.721111 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert podName:97e00ea1-e79f-4a6f-b820-0cafb65f4308 nodeName:}" failed. No retries permitted until 2026-04-16 21:02:34.721100221 +0000 UTC m=+282.510717203 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert") pod "ingress-canary-72rld" (UID: "97e00ea1-e79f-4a6f-b820-0cafb65f4308") : secret "canary-serving-cert" not found Apr 16 21:00:40.208515 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:40.208480 2575 generic.go:358] "Generic (PLEG): container finished" podID="513ff303-5311-4cf5-8916-d5c41fdf0c3b" containerID="f59c3d82a6211d03f1ac692faacfb6819a12b5413a7d6a8786e7dcaecca1e5fe" exitCode=255 Apr 16 21:00:40.208970 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:40.208529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" event={"ID":"513ff303-5311-4cf5-8916-d5c41fdf0c3b","Type":"ContainerDied","Data":"f59c3d82a6211d03f1ac692faacfb6819a12b5413a7d6a8786e7dcaecca1e5fe"} Apr 16 21:00:40.208970 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:40.208883 2575 scope.go:117] "RemoveContainer" containerID="f59c3d82a6211d03f1ac692faacfb6819a12b5413a7d6a8786e7dcaecca1e5fe" Apr 16 21:00:41.212609 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:41.212567 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c6499d69c-tj2hp" event={"ID":"513ff303-5311-4cf5-8916-d5c41fdf0c3b","Type":"ContainerStarted","Data":"0df3b91a54dc0e4df40829dca7457a6d9c48b2efd837d4f898b02adfedd98021"} Apr 16 21:00:41.714750 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:41.714723 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 21:00:42.716942 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:42.716912 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 21:00:42.932804 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:42.932745 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" podUID="4910ea1a-2750-4912-8fcb-8242e5118e32" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused" Apr 16 21:00:43.218288 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:43.218256 2575 generic.go:358] "Generic (PLEG): container finished" podID="4910ea1a-2750-4912-8fcb-8242e5118e32" containerID="7228ae16db09b7ce4f7b10fac566f0b496a459b860f695c198f3e6296bf84546" exitCode=1 Apr 16 21:00:43.218481 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:43.218296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" event={"ID":"4910ea1a-2750-4912-8fcb-8242e5118e32","Type":"ContainerDied","Data":"7228ae16db09b7ce4f7b10fac566f0b496a459b860f695c198f3e6296bf84546"} Apr 16 21:00:43.218656 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:43.218640 2575 scope.go:117] "RemoveContainer" containerID="7228ae16db09b7ce4f7b10fac566f0b496a459b860f695c198f3e6296bf84546" Apr 16 21:00:44.221910 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:44.221876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" event={"ID":"4910ea1a-2750-4912-8fcb-8242e5118e32","Type":"ContainerStarted","Data":"f6bb3e5e5afa61bdce6a006916be7bd2bdde15799957272197435d0bca91e7d2"} Apr 16 21:00:44.222293 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:44.222179 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 21:00:44.222757 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:44.222740 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6558dfbfd8-85z9d" Apr 16 21:00:50.843058 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.843013 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pzkf8"] Apr 16 21:00:50.846132 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.846110 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:50.851796 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.851778 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 21:00:50.851903 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.851823 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 21:00:50.852576 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.852408 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 21:00:50.852576 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.852466 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 21:00:50.853391 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.853371 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b7pz7\"" Apr 16 21:00:50.862746 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.862721 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pzkf8"] Apr 16 21:00:50.960490 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.960457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:50.960629 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.960535 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:50.960629 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.960559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-data-volume\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:50.960629 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.960576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-crio-socket\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:50.960629 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:50.960616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmzv\" (UniqueName: \"kubernetes.io/projected/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-kube-api-access-wgmzv\") pod \"insights-runtime-extractor-pzkf8\" (UID: 
\"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061080 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmzv\" (UniqueName: \"kubernetes.io/projected/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-kube-api-access-wgmzv\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061080 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061268 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061268 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-data-volume\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061268 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-crio-socket\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061375 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-crio-socket\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061571 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-data-volume\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.061763 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.061743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.063483 ip-10-0-141-171 
kubenswrapper[2575]: I0416 21:00:51.063463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.077794 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.077769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmzv\" (UniqueName: \"kubernetes.io/projected/d4e21d83-3fa0-4b03-80a5-9daa14fbf570-kube-api-access-wgmzv\") pod \"insights-runtime-extractor-pzkf8\" (UID: \"d4e21d83-3fa0-4b03-80a5-9daa14fbf570\") " pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.155107 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.155038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pzkf8" Apr 16 21:00:51.278024 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:51.278003 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pzkf8"] Apr 16 21:00:51.280212 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:00:51.280188 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e21d83_3fa0_4b03_80a5_9daa14fbf570.slice/crio-df293a5c6d2d2f89d8a7272fe28f18cedca539316162113f5865eeb66ed27fa2 WatchSource:0}: Error finding container df293a5c6d2d2f89d8a7272fe28f18cedca539316162113f5865eeb66ed27fa2: Status 404 returned error can't find the container with id df293a5c6d2d2f89d8a7272fe28f18cedca539316162113f5865eeb66ed27fa2 Apr 16 21:00:52.242555 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:52.242528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pzkf8" event={"ID":"d4e21d83-3fa0-4b03-80a5-9daa14fbf570","Type":"ContainerStarted","Data":"b5ecf1a992e294301889c2fd6ad418507ed0c887a48a0e93d6c53a9f060d610c"} Apr 16 21:00:52.242841 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:52.242560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pzkf8" event={"ID":"d4e21d83-3fa0-4b03-80a5-9daa14fbf570","Type":"ContainerStarted","Data":"34d8eb33d2e4fde91f99b6bea6313a2e668d42122b62b91d8b4116d8bdba5e00"} Apr 16 21:00:52.242841 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:52.242572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pzkf8" event={"ID":"d4e21d83-3fa0-4b03-80a5-9daa14fbf570","Type":"ContainerStarted","Data":"df293a5c6d2d2f89d8a7272fe28f18cedca539316162113f5865eeb66ed27fa2"} Apr 16 21:00:54.250346 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:54.250306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pzkf8" event={"ID":"d4e21d83-3fa0-4b03-80a5-9daa14fbf570","Type":"ContainerStarted","Data":"060f13bd813cd63ae1d400fcd6bed755a516f9ab403b4a4eab4f7e8efde8aa57"} Apr 16 21:00:54.269539 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:00:54.269499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pzkf8" podStartSLOduration=1.853573763 podStartE2EDuration="4.269485406s" podCreationTimestamp="2026-04-16 21:00:50 +0000 UTC" 
firstStartedPulling="2026-04-16 21:00:51.33241922 +0000 UTC m=+179.122036202" lastFinishedPulling="2026-04-16 21:00:53.748330862 +0000 UTC m=+181.537947845" observedRunningTime="2026-04-16 21:00:54.268855082 +0000 UTC m=+182.058472087" watchObservedRunningTime="2026-04-16 21:00:54.269485406 +0000 UTC m=+182.059102411" Apr 16 21:01:03.211943 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.211907 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h5bnx"] Apr 16 21:01:03.215347 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.215328 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.217957 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.217936 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 21:01:03.218075 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.218015 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-l4vt2\"" Apr 16 21:01:03.218126 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.218078 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 21:01:03.218521 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.218501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 21:01:03.218619 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.218602 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 21:01:03.219465 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.219448 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 21:01:03.219465 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.219463 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 21:01:03.255069 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-tls\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255181 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-sys\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255181 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-wtmp\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255181 ip-10-0-141-171 kubenswrapper[2575]: I0416 
21:01:03.255143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-textfile\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255291 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255241 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255291 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-metrics-client-ca\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255355 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxjpn\" (UniqueName: \"kubernetes.io/projected/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-kube-api-access-qxjpn\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255355 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255348 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-root\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.255433 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.255376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356158 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-metrics-client-ca\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356158 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxjpn\" (UniqueName: \"kubernetes.io/projected/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-kube-api-access-qxjpn\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356369 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-root\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356369 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356369 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-tls\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356369 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-root\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356369 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-sys\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-wtmp\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-textfile\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-sys\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:01:03.356460 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:01:03.356521 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-tls podName:e84b920a-4041-4dcf-a3ee-2e9e4187bfe7 nodeName:}" failed. No retries permitted until 2026-04-16 21:01:03.85650162 +0000 UTC m=+191.646118611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-tls") pod "node-exporter-h5bnx" (UID: "e84b920a-4041-4dcf-a3ee-2e9e4187bfe7") : secret "node-exporter-tls" not found Apr 16 21:01:03.356647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-wtmp\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356962 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-textfile\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356962 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.356962 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.356847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-metrics-client-ca\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.358739 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.358720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.365199 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.365176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxjpn\" (UniqueName: \"kubernetes.io/projected/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-kube-api-access-qxjpn\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.860652 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.860612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-tls\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:03.862926 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:03.862905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e84b920a-4041-4dcf-a3ee-2e9e4187bfe7-node-exporter-tls\") pod \"node-exporter-h5bnx\" (UID: \"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7\") " pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:04.124103 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:04.124024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h5bnx" Apr 16 21:01:04.131786 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:01:04.131756 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84b920a_4041_4dcf_a3ee_2e9e4187bfe7.slice/crio-5236ad16e481b4a49f6d680265b97fc79cb2ce46ab57231bd66ac2f23d6b0a35 WatchSource:0}: Error finding container 5236ad16e481b4a49f6d680265b97fc79cb2ce46ab57231bd66ac2f23d6b0a35: Status 404 returned error can't find the container with id 5236ad16e481b4a49f6d680265b97fc79cb2ce46ab57231bd66ac2f23d6b0a35 Apr 16 21:01:04.275509 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:04.275466 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5bnx" event={"ID":"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7","Type":"ContainerStarted","Data":"5236ad16e481b4a49f6d680265b97fc79cb2ce46ab57231bd66ac2f23d6b0a35"} Apr 16 21:01:05.278683 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:05.278645 2575 generic.go:358] "Generic (PLEG): container finished" podID="e84b920a-4041-4dcf-a3ee-2e9e4187bfe7" containerID="37a62517e7dc14304c01a774f4984fdd22a3480792de2821f8f80164fb9b6e4f" exitCode=0 Apr 16 21:01:05.279022 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:05.278715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5bnx" event={"ID":"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7","Type":"ContainerDied","Data":"37a62517e7dc14304c01a774f4984fdd22a3480792de2821f8f80164fb9b6e4f"} Apr 16 21:01:06.284663 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:06.284623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5bnx" event={"ID":"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7","Type":"ContainerStarted","Data":"217dbb17c460bc5ce1e818d9daa5320328c6d8ec87e8e195cdf49f7f361889ca"} Apr 16 21:01:06.284663 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:06.284669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5bnx" event={"ID":"e84b920a-4041-4dcf-a3ee-2e9e4187bfe7","Type":"ContainerStarted","Data":"31b48f02b974885c0c4bfe58b7845b8239ee951e9fd15031f873c5b54e5ab370"} Apr 16 21:01:06.306170 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:06.306126 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h5bnx" podStartSLOduration=2.612792273 podStartE2EDuration="3.306114487s" podCreationTimestamp="2026-04-16 21:01:03 +0000 UTC" firstStartedPulling="2026-04-16 21:01:04.133654628 +0000 UTC m=+191.923271611" lastFinishedPulling="2026-04-16 21:01:04.826976843 +0000 UTC m=+192.616593825" observedRunningTime="2026-04-16 21:01:06.304540782 +0000 UTC m=+194.094157787" 
watchObservedRunningTime="2026-04-16 21:01:06.306114487 +0000 UTC m=+194.095731491" Apr 16 21:01:12.915173 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:12.915137 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74c85f5b8b-5nkg7"] Apr 16 21:01:12.915539 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:01:12.915330 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" podUID="81c67e7e-270f-4265-a65a-8caa7e5da99f" Apr 16 21:01:13.302272 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.302178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 21:01:13.306286 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.306263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 21:01:13.335588 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335558 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-trusted-ca\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.335726 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-bound-sa-token\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.335726 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335645 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81c67e7e-270f-4265-a65a-8caa7e5da99f-ca-trust-extracted\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.335726 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335677 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvbw\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-kube-api-access-4gvbw\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.335726 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335714 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-image-registry-private-configuration\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.335885 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335751 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-certificates\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.335885 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335785 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-installation-pull-secrets\") pod \"81c67e7e-270f-4265-a65a-8caa7e5da99f\" (UID: \"81c67e7e-270f-4265-a65a-8caa7e5da99f\") " Apr 16 21:01:13.336100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.336071 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c67e7e-270f-4265-a65a-8caa7e5da99f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:01:13.336100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.335987 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:13.336270 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.336240 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:01:13.337957 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.337926 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-kube-api-access-4gvbw" (OuterVolumeSpecName: "kube-api-access-4gvbw") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "kube-api-access-4gvbw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:01:13.338054 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.337991 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:01:13.338054 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.338025 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:13.338136 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.338120 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "81c67e7e-270f-4265-a65a-8caa7e5da99f" (UID: "81c67e7e-270f-4265-a65a-8caa7e5da99f"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 21:01:13.437296 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437261 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81c67e7e-270f-4265-a65a-8caa7e5da99f-ca-trust-extracted\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:13.437296 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437291 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gvbw\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-kube-api-access-4gvbw\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:13.437296 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437301 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-image-registry-private-configuration\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:13.437547 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437311 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-certificates\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:13.437547 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437322 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81c67e7e-270f-4265-a65a-8caa7e5da99f-installation-pull-secrets\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:13.437547 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437331 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c67e7e-270f-4265-a65a-8caa7e5da99f-trusted-ca\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:13.437547 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:13.437339 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-bound-sa-token\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:14.304183 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:14.304152 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-74c85f5b8b-5nkg7" Apr 16 21:01:14.347332 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:14.347305 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74c85f5b8b-5nkg7"] Apr 16 21:01:14.350958 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:14.350937 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-74c85f5b8b-5nkg7"] Apr 16 21:01:14.446652 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:14.446620 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81c67e7e-270f-4265-a65a-8caa7e5da99f-registry-tls\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:01:14.718322 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:14.718294 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c67e7e-270f-4265-a65a-8caa7e5da99f" path="/var/lib/kubelet/pods/81c67e7e-270f-4265-a65a-8caa7e5da99f/volumes" Apr 16 21:01:15.296493 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:15.296431 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" podUID="69266b49-8b1f-4643-accb-e8a20922a94e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 21:01:25.296700 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:25.296656 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" podUID="69266b49-8b1f-4643-accb-e8a20922a94e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 21:01:35.296927 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:35.296881 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" podUID="69266b49-8b1f-4643-accb-e8a20922a94e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 21:01:35.297376 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:35.296965 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" Apr 16 21:01:35.297482 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:35.297422 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"061a350b0a0adf042a961829e42c80d5dd39115ef2e4d8aadff1c766ce941826"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 21:01:35.297524 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:35.297512 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" podUID="69266b49-8b1f-4643-accb-e8a20922a94e" containerName="service-proxy" containerID="cri-o://061a350b0a0adf042a961829e42c80d5dd39115ef2e4d8aadff1c766ce941826" gracePeriod=30 Apr 16 21:01:36.356185 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:36.356151 2575 generic.go:358] "Generic (PLEG): container finished" podID="69266b49-8b1f-4643-accb-e8a20922a94e" containerID="061a350b0a0adf042a961829e42c80d5dd39115ef2e4d8aadff1c766ce941826" 
exitCode=2 Apr 16 21:01:36.356596 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:36.356223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" event={"ID":"69266b49-8b1f-4643-accb-e8a20922a94e","Type":"ContainerDied","Data":"061a350b0a0adf042a961829e42c80d5dd39115ef2e4d8aadff1c766ce941826"} Apr 16 21:01:36.356596 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:01:36.356263 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6769466bc4-ngbtk" event={"ID":"69266b49-8b1f-4643-accb-e8a20922a94e","Type":"ContainerStarted","Data":"0c25c599274f9ad00553a1218c9e866def8c7519da5ca1790d07b85289db25dc"} Apr 16 21:02:04.633879 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:04.633819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 21:02:04.636145 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:04.636123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85e4090a-8dbc-4412-b20a-23d79d838363-metrics-certs\") pod \"network-metrics-daemon-zbj49\" (UID: \"85e4090a-8dbc-4412-b20a-23d79d838363\") " pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 21:02:04.920759 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:04.920725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lvzr6\"" Apr 16 21:02:04.928237 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:04.928210 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zbj49" Apr 16 21:02:05.040643 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:05.040557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zbj49"] Apr 16 21:02:05.042617 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:02:05.042584 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85e4090a_8dbc_4412_b20a_23d79d838363.slice/crio-5ca074fab84be571deeed66c215e6006965723b1110d415fbae7a2da1f77fad5 WatchSource:0}: Error finding container 5ca074fab84be571deeed66c215e6006965723b1110d415fbae7a2da1f77fad5: Status 404 returned error can't find the container with id 5ca074fab84be571deeed66c215e6006965723b1110d415fbae7a2da1f77fad5 Apr 16 21:02:05.425493 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:05.425462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zbj49" event={"ID":"85e4090a-8dbc-4412-b20a-23d79d838363","Type":"ContainerStarted","Data":"5ca074fab84be571deeed66c215e6006965723b1110d415fbae7a2da1f77fad5"} Apr 16 21:02:06.429387 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:06.429353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zbj49" event={"ID":"85e4090a-8dbc-4412-b20a-23d79d838363","Type":"ContainerStarted","Data":"59455f3c527ba07ebacd9f8d291692ea2013c38ec49f94fbb805e8cfa5591a48"} Apr 16 21:02:06.429859 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:06.429392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zbj49" event={"ID":"85e4090a-8dbc-4412-b20a-23d79d838363","Type":"ContainerStarted","Data":"0345328adda4d01fb9169dacb1302756d60ae26b081fd63357fd4b6ecafad98b"} Apr 16 21:02:06.446712 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:06.446640 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zbj49" podStartSLOduration=253.304014117 podStartE2EDuration="4m14.446626575s" podCreationTimestamp="2026-04-16 20:57:52 +0000 UTC" firstStartedPulling="2026-04-16 21:02:05.044413051 +0000 UTC m=+252.834030033" lastFinishedPulling="2026-04-16 21:02:06.187025507 +0000 UTC m=+253.976642491" observedRunningTime="2026-04-16 21:02:06.446232907 +0000 UTC m=+254.235849909" watchObservedRunningTime="2026-04-16 21:02:06.446626575 +0000 UTC m=+254.236243580" Apr 16 21:02:31.182400 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:02:31.182362 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-b7bcv" podUID="6767bcb0-f122-44f9-b378-3f18e741a065" Apr 16 21:02:31.182400 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:02:31.182362 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" podUID="7852b3f2-db76-4624-a66c-450474aeaa93" Apr 16 21:02:31.494350 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:31.494264 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 21:02:31.494350 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:31.494312 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b7bcv" Apr 16 21:02:34.636405 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.636355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 21:02:34.638675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.638654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7852b3f2-db76-4624-a66c-450474aeaa93-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ck6cd\" (UID: \"7852b3f2-db76-4624-a66c-450474aeaa93\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 21:02:34.737326 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.737292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 21:02:34.737536 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.737334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 21:02:34.739649 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.739618 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6767bcb0-f122-44f9-b378-3f18e741a065-metrics-tls\") pod \"dns-default-b7bcv\" (UID: \"6767bcb0-f122-44f9-b378-3f18e741a065\") " pod="openshift-dns/dns-default-b7bcv" Apr 16 21:02:34.739772 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.739708 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97e00ea1-e79f-4a6f-b820-0cafb65f4308-cert\") pod \"ingress-canary-72rld\" (UID: \"97e00ea1-e79f-4a6f-b820-0cafb65f4308\") " pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 21:02:34.798753 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.798718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfmp4\"" Apr 16 21:02:34.799727 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.799710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ggv4b\"" Apr 16 21:02:34.805782 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.805762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" Apr 16 21:02:34.805882 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.805789 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-b7bcv" Apr 16 21:02:34.818709 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.818682 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mhzpf\"" Apr 16 21:02:34.826496 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.826467 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-72rld" Apr 16 21:02:34.966833 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.966804 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b7bcv"] Apr 16 21:02:34.971431 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:02:34.971401 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6767bcb0_f122_44f9_b378_3f18e741a065.slice/crio-a8505aafba1367cd3659200bded1ad95767b57c89373a736bfc145ae8b41b455 WatchSource:0}: Error finding container a8505aafba1367cd3659200bded1ad95767b57c89373a736bfc145ae8b41b455: Status 404 returned error can't find the container with id a8505aafba1367cd3659200bded1ad95767b57c89373a736bfc145ae8b41b455 Apr 16 21:02:34.983835 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:34.983789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd"] Apr 16 21:02:34.987668 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:02:34.987639 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7852b3f2_db76_4624_a66c_450474aeaa93.slice/crio-8d043b88ede8b61502ea02ade45bd9279a478203da6a200b1c90a91e9afe971a WatchSource:0}: Error finding container 8d043b88ede8b61502ea02ade45bd9279a478203da6a200b1c90a91e9afe971a: Status 404 returned error can't find the container with id 8d043b88ede8b61502ea02ade45bd9279a478203da6a200b1c90a91e9afe971a Apr 16 21:02:35.002921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:35.002901 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-72rld"] Apr 16 21:02:35.005158 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:02:35.005128 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e00ea1_e79f_4a6f_b820_0cafb65f4308.slice/crio-1490845a54eb2a34ea0da907079e241b3b614a4c99c6197ba2b08ae99513acf8 WatchSource:0}: Error finding container 1490845a54eb2a34ea0da907079e241b3b614a4c99c6197ba2b08ae99513acf8: Status 404 returned error can't find the container with id 1490845a54eb2a34ea0da907079e241b3b614a4c99c6197ba2b08ae99513acf8 Apr 16 21:02:35.504400 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:35.504360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b7bcv" event={"ID":"6767bcb0-f122-44f9-b378-3f18e741a065","Type":"ContainerStarted","Data":"a8505aafba1367cd3659200bded1ad95767b57c89373a736bfc145ae8b41b455"} Apr 16 21:02:35.505361 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:35.505340 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-72rld" event={"ID":"97e00ea1-e79f-4a6f-b820-0cafb65f4308","Type":"ContainerStarted","Data":"1490845a54eb2a34ea0da907079e241b3b614a4c99c6197ba2b08ae99513acf8"} Apr 16 21:02:35.506259 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:35.506240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" event={"ID":"7852b3f2-db76-4624-a66c-450474aeaa93","Type":"ContainerStarted","Data":"8d043b88ede8b61502ea02ade45bd9279a478203da6a200b1c90a91e9afe971a"} Apr 16 21:02:37.512660 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.512623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-72rld" event={"ID":"97e00ea1-e79f-4a6f-b820-0cafb65f4308","Type":"ContainerStarted","Data":"95e07b5e806a041c00e391e261aa26d08daa0a6e04af6249c8e4bec5c0d15cc0"} Apr 16 21:02:37.513957 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.513924 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" event={"ID":"7852b3f2-db76-4624-a66c-450474aeaa93","Type":"ContainerStarted","Data":"d6b2a12726e5881ce83c7243f0762e1e094158d9a51618a3925a620b869ce616"} Apr 16 21:02:37.515341 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.515320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b7bcv" event={"ID":"6767bcb0-f122-44f9-b378-3f18e741a065","Type":"ContainerStarted","Data":"cd88bf0e7e65136669cee9e28ecb17ed8cbef66e4b74cb55f5ac8b7bd0a078f2"} Apr 16 21:02:37.515341 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.515343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b7bcv" event={"ID":"6767bcb0-f122-44f9-b378-3f18e741a065","Type":"ContainerStarted","Data":"2cfa4fa612305d29eb4da9950e808932cc483829d9d0b903581a3fc865665664"} Apr 16 21:02:37.515556 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.515477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-b7bcv" Apr 16 21:02:37.540953 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.540906 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-72rld" podStartSLOduration=251.489091121 podStartE2EDuration="4m13.540892524s" podCreationTimestamp="2026-04-16 20:58:24 +0000 UTC" firstStartedPulling="2026-04-16 21:02:35.006928708 +0000 UTC m=+282.796545691" lastFinishedPulling="2026-04-16 21:02:37.058730111 +0000 UTC m=+284.848347094" observedRunningTime="2026-04-16 21:02:37.539788408 +0000 UTC m=+285.329405423" watchObservedRunningTime="2026-04-16 21:02:37.540892524 +0000 UTC m=+285.330509563" Apr 16 21:02:37.564161 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:37.564115 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ck6cd" podStartSLOduration=265.504091116 podStartE2EDuration="4m27.564102225s" podCreationTimestamp="2026-04-16 20:58:10 +0000 UTC" firstStartedPulling="2026-04-16 21:02:34.989480382 +0000 UTC m=+282.779097364" lastFinishedPulling="2026-04-16 21:02:37.049491476 +0000 UTC m=+284.839108473" observedRunningTime="2026-04-16 21:02:37.563266384 +0000 UTC m=+285.352883389" watchObservedRunningTime="2026-04-16 21:02:37.564102225 +0000 UTC m=+285.353719230" Apr 16 21:02:47.521208 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:47.521172 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-b7bcv" Apr 16 21:02:47.539469 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:47.539400 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-b7bcv" podStartSLOduration=261.458420815 
podStartE2EDuration="4m23.539383527s" podCreationTimestamp="2026-04-16 20:58:24 +0000 UTC" firstStartedPulling="2026-04-16 21:02:34.973324027 +0000 UTC m=+282.762941013" lastFinishedPulling="2026-04-16 21:02:37.054286739 +0000 UTC m=+284.843903725" observedRunningTime="2026-04-16 21:02:37.597270457 +0000 UTC m=+285.386887464" watchObservedRunningTime="2026-04-16 21:02:47.539383527 +0000 UTC m=+295.329000532" Apr 16 21:02:52.616996 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:52.616967 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log" Apr 16 21:02:52.618793 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:52.618767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log" Apr 16 21:02:52.622061 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:02:52.622039 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 21:04:03.311861 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.311784 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh"] Apr 16 21:04:03.314677 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.314660 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.317511 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.317490 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 21:04:03.318517 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.318502 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 21:04:03.318582 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.318541 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-n6w8p\"" Apr 16 21:04:03.322320 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.322302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh"] Apr 16 21:04:03.342906 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.342881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0cf30573-34f1-415d-b0be-ebf5aa3ddca8-tmp\") pod \"openshift-lws-operator-bfc7f696d-wqbvh\" (UID: \"0cf30573-34f1-415d-b0be-ebf5aa3ddca8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.343025 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.342916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srcsn\" (UniqueName: \"kubernetes.io/projected/0cf30573-34f1-415d-b0be-ebf5aa3ddca8-kube-api-access-srcsn\") pod \"openshift-lws-operator-bfc7f696d-wqbvh\" (UID: \"0cf30573-34f1-415d-b0be-ebf5aa3ddca8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.443611 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.443576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0cf30573-34f1-415d-b0be-ebf5aa3ddca8-tmp\") pod 
\"openshift-lws-operator-bfc7f696d-wqbvh\" (UID: \"0cf30573-34f1-415d-b0be-ebf5aa3ddca8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.443753 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.443617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srcsn\" (UniqueName: \"kubernetes.io/projected/0cf30573-34f1-415d-b0be-ebf5aa3ddca8-kube-api-access-srcsn\") pod \"openshift-lws-operator-bfc7f696d-wqbvh\" (UID: \"0cf30573-34f1-415d-b0be-ebf5aa3ddca8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.443914 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.443897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0cf30573-34f1-415d-b0be-ebf5aa3ddca8-tmp\") pod \"openshift-lws-operator-bfc7f696d-wqbvh\" (UID: \"0cf30573-34f1-415d-b0be-ebf5aa3ddca8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.452217 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.452187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srcsn\" (UniqueName: \"kubernetes.io/projected/0cf30573-34f1-415d-b0be-ebf5aa3ddca8-kube-api-access-srcsn\") pod \"openshift-lws-operator-bfc7f696d-wqbvh\" (UID: \"0cf30573-34f1-415d-b0be-ebf5aa3ddca8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.624923 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.624829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" Apr 16 21:04:03.743409 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.743304 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh"] Apr 16 21:04:03.746086 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:04:03.746062 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf30573_34f1_415d_b0be_ebf5aa3ddca8.slice/crio-8977232911b006212390a6944b79d337b48823f60c79b1bd92f299e3895c4437 WatchSource:0}: Error finding container 8977232911b006212390a6944b79d337b48823f60c79b1bd92f299e3895c4437: Status 404 returned error can't find the container with id 8977232911b006212390a6944b79d337b48823f60c79b1bd92f299e3895c4437 Apr 16 21:04:03.747291 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:03.747276 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:04:04.739059 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:04.739020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" event={"ID":"0cf30573-34f1-415d-b0be-ebf5aa3ddca8","Type":"ContainerStarted","Data":"8977232911b006212390a6944b79d337b48823f60c79b1bd92f299e3895c4437"} Apr 16 21:04:06.745329 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:06.745297 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" event={"ID":"0cf30573-34f1-415d-b0be-ebf5aa3ddca8","Type":"ContainerStarted","Data":"b834a29f72a822a9c70f85a2cefc1e709a19ca2d22d182c5b9655b304f4a0ec6"} Apr 16 21:04:06.760663 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:06.760607 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wqbvh" podStartSLOduration=1.3127442280000001 podStartE2EDuration="3.760589586s" podCreationTimestamp="2026-04-16 21:04:03 +0000 UTC" firstStartedPulling="2026-04-16 21:04:03.747399532 +0000 UTC m=+371.537016515" lastFinishedPulling="2026-04-16 21:04:06.19524489 +0000 UTC m=+373.984861873" observedRunningTime="2026-04-16 21:04:06.760216726 +0000 UTC m=+374.549833731" watchObservedRunningTime="2026-04-16 21:04:06.760589586 +0000 UTC m=+374.550206605" Apr 16 21:04:22.688299 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.688255 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp"] Apr 16 21:04:22.691336 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.691315 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.695150 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.695126 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 21:04:22.695266 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.695148 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 21:04:22.695266 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.695173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 21:04:22.695266 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.695178 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjnvs\"" Apr 16 21:04:22.695608 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.695591 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 21:04:22.714187 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.714160 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp"] Apr 16 21:04:22.780928 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.780893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.781120 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.780939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdtw\" (UniqueName: \"kubernetes.io/projected/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-kube-api-access-7vdtw\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.781120 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.780973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.882329 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.882303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.882506 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.882339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdtw\" (UniqueName: \"kubernetes.io/projected/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-kube-api-access-7vdtw\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.882506 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.882363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.884687 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.884659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.884815 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.884695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:22.892194 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:22.892170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdtw\" (UniqueName: \"kubernetes.io/projected/0d2b81e9-f4cc-4f8a-ac8f-86e433868873-kube-api-access-7vdtw\") pod \"opendatahub-operator-controller-manager-5f94c666bb-9phcp\" (UID: \"0d2b81e9-f4cc-4f8a-ac8f-86e433868873\") " pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:23.002130 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:23.002034 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:23.147737 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:23.147703 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp"] Apr 16 21:04:23.150809 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:04:23.150779 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d2b81e9_f4cc_4f8a_ac8f_86e433868873.slice/crio-000b92063350fc18eb78e6a65489818aceca958e53c0675307c37ddc87c57a03 WatchSource:0}: Error finding container 000b92063350fc18eb78e6a65489818aceca958e53c0675307c37ddc87c57a03: Status 404 returned error can't find the container with id 000b92063350fc18eb78e6a65489818aceca958e53c0675307c37ddc87c57a03 Apr 16 21:04:23.787176 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:23.787134 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" event={"ID":"0d2b81e9-f4cc-4f8a-ac8f-86e433868873","Type":"ContainerStarted","Data":"000b92063350fc18eb78e6a65489818aceca958e53c0675307c37ddc87c57a03"} Apr 16 21:04:25.794910 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:25.794869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" event={"ID":"0d2b81e9-f4cc-4f8a-ac8f-86e433868873","Type":"ContainerStarted","Data":"432099ae463c4dfcbf6ac1efe395cd98c55ceb724bc0cf8b89597ab8f59f2c6a"} Apr 16 21:04:25.795268 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:25.795053 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:25.815808 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:25.815760 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" podStartSLOduration=1.263850241 podStartE2EDuration="3.815748233s" podCreationTimestamp="2026-04-16 21:04:22 +0000 UTC" firstStartedPulling="2026-04-16 21:04:23.152496958 +0000 UTC m=+390.942113941" lastFinishedPulling="2026-04-16 21:04:25.704394951 +0000 UTC m=+393.494011933" observedRunningTime="2026-04-16 21:04:25.813734412 +0000 UTC m=+393.603351417" watchObservedRunningTime="2026-04-16 21:04:25.815748233 +0000 UTC m=+393.605365237" Apr 16 21:04:36.799594 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:36.799557 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f94c666bb-9phcp" Apr 16 21:04:42.095004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.094966 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-56b49765cd-cszkj"] Apr 16 21:04:42.102117 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.102085 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.109747 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.109725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 21:04:42.109867 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.109727 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 21:04:42.110181 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.110162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 21:04:42.111428 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.111382 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-x4wfj\"" Apr 16 21:04:42.113249 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.113224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56b49765cd-cszkj"] Apr 16 21:04:42.113395 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.113379 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 21:04:42.220817 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.220780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4t7v\" (UniqueName: \"kubernetes.io/projected/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-kube-api-access-c4t7v\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.220987 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.220873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-tmp\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.220987 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.220897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-tls-certs\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.321984 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.321947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-tmp\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.321984 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.321988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-tls-certs\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.322215 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.322131 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4t7v\" (UniqueName: \"kubernetes.io/projected/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-kube-api-access-c4t7v\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.324670 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.324651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-tls-certs\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.326007 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.325992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-tmp\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.330397 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.330371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4t7v\" (UniqueName: \"kubernetes.io/projected/4ed78395-b9d7-4a62-b308-e860cc2c8ce5-kube-api-access-c4t7v\") pod \"kube-auth-proxy-56b49765cd-cszkj\" (UID: \"4ed78395-b9d7-4a62-b308-e860cc2c8ce5\") " pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.411165 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.411140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" Apr 16 21:04:42.530127 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.530094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56b49765cd-cszkj"] Apr 16 21:04:42.534387 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:04:42.534357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed78395_b9d7_4a62_b308_e860cc2c8ce5.slice/crio-c55938a24f8e5fe26fa8146bd7807e515c83e9553a2a86771f64884710243ae7 WatchSource:0}: Error finding container c55938a24f8e5fe26fa8146bd7807e515c83e9553a2a86771f64884710243ae7: Status 404 returned error can't find the container with id c55938a24f8e5fe26fa8146bd7807e515c83e9553a2a86771f64884710243ae7 Apr 16 21:04:42.842614 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:42.842530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" event={"ID":"4ed78395-b9d7-4a62-b308-e860cc2c8ce5","Type":"ContainerStarted","Data":"c55938a24f8e5fe26fa8146bd7807e515c83e9553a2a86771f64884710243ae7"} Apr 16 21:04:44.878030 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.877997 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-9mpb6"] Apr 16 21:04:44.881016 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.880999 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:44.883900 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.883868 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 21:04:44.884022 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.883960 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-qwrwh\"" Apr 16 21:04:44.890921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.890897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-9mpb6"] Apr 16 21:04:44.944675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.944641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:44.944862 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:44.944698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvgn\" (UniqueName: \"kubernetes.io/projected/900eb218-7e34-4aad-9e70-a8cb276b5b9f-kube-api-access-trvgn\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:45.046045 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:45.046007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trvgn\" (UniqueName: \"kubernetes.io/projected/900eb218-7e34-4aad-9e70-a8cb276b5b9f-kube-api-access-trvgn\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:45.046197 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:45.046101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:45.046268 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:45.046246 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 21:04:45.046328 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:45.046319 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert podName:900eb218-7e34-4aad-9e70-a8cb276b5b9f nodeName:}" failed. No retries permitted until 2026-04-16 21:04:45.546303364 +0000 UTC m=+413.335920347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert") pod "odh-model-controller-858dbf95b8-9mpb6" (UID: "900eb218-7e34-4aad-9e70-a8cb276b5b9f") : secret "odh-model-controller-webhook-cert" not found Apr 16 21:04:45.057148 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:45.057120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvgn\" (UniqueName: \"kubernetes.io/projected/900eb218-7e34-4aad-9e70-a8cb276b5b9f-kube-api-access-trvgn\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:45.550486 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:45.550413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:45.550672 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:45.550558 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 21:04:45.550672 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:45.550642 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert podName:900eb218-7e34-4aad-9e70-a8cb276b5b9f nodeName:}" failed. No retries permitted until 2026-04-16 21:04:46.550619783 +0000 UTC m=+414.340236769 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert") pod "odh-model-controller-858dbf95b8-9mpb6" (UID: "900eb218-7e34-4aad-9e70-a8cb276b5b9f") : secret "odh-model-controller-webhook-cert" not found Apr 16 21:04:45.853131 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:45.853041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" event={"ID":"4ed78395-b9d7-4a62-b308-e860cc2c8ce5","Type":"ContainerStarted","Data":"2639caef84fa4c57b438daf6a9ee784a6e23ef3dec3e1b21d90cf212ded4fa94"} Apr 16 21:04:45.871175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:45.871127 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-56b49765cd-cszkj" podStartSLOduration=0.65982014 podStartE2EDuration="3.871112399s" podCreationTimestamp="2026-04-16 21:04:42 +0000 UTC" firstStartedPulling="2026-04-16 21:04:42.536112754 +0000 UTC m=+410.325729737" lastFinishedPulling="2026-04-16 21:04:45.74740501 +0000 UTC m=+413.537021996" observedRunningTime="2026-04-16 21:04:45.869954123 +0000 UTC m=+413.659571128" watchObservedRunningTime="2026-04-16 21:04:45.871112399 +0000 UTC m=+413.660729404" Apr 16 21:04:46.557515 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:46.557472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:46.559776 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:46.559755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/900eb218-7e34-4aad-9e70-a8cb276b5b9f-cert\") pod \"odh-model-controller-858dbf95b8-9mpb6\" (UID: \"900eb218-7e34-4aad-9e70-a8cb276b5b9f\") " pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:46.691547 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:46.691517 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:47.021398 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:47.021369 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-9mpb6"] Apr 16 21:04:47.023493 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:04:47.023460 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod900eb218_7e34_4aad_9e70_a8cb276b5b9f.slice/crio-b1f2e205242286f73ad338741dc4d6343203df716d8a90b6cc8edaa5ed416502 WatchSource:0}: Error finding container b1f2e205242286f73ad338741dc4d6343203df716d8a90b6cc8edaa5ed416502: Status 404 returned error can't find the container with id b1f2e205242286f73ad338741dc4d6343203df716d8a90b6cc8edaa5ed416502 Apr 16 21:04:47.860074 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:47.860037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" event={"ID":"900eb218-7e34-4aad-9e70-a8cb276b5b9f","Type":"ContainerStarted","Data":"b1f2e205242286f73ad338741dc4d6343203df716d8a90b6cc8edaa5ed416502"} Apr 16 21:04:49.867613 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:49.867584 2575 generic.go:358] "Generic (PLEG): container finished" podID="900eb218-7e34-4aad-9e70-a8cb276b5b9f" containerID="ee5ae4af7c249107ee70eea969bce468f8d65d8c012ca4e8115ce9f7561cba51" exitCode=1 Apr 16 21:04:49.867968 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:49.867675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" event={"ID":"900eb218-7e34-4aad-9e70-a8cb276b5b9f","Type":"ContainerDied","Data":"ee5ae4af7c249107ee70eea969bce468f8d65d8c012ca4e8115ce9f7561cba51"} Apr 16 21:04:49.867968 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:49.867927 2575 scope.go:117] "RemoveContainer" containerID="ee5ae4af7c249107ee70eea969bce468f8d65d8c012ca4e8115ce9f7561cba51" Apr 16 21:04:50.871970 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:50.871932 2575 generic.go:358] "Generic (PLEG): container finished" podID="900eb218-7e34-4aad-9e70-a8cb276b5b9f" containerID="d19449797159210432a8880205dfd5fd4adc9cf554cb0010b1efcc89afc6f8ef" exitCode=1 Apr 16 21:04:50.872391 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:50.871988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" event={"ID":"900eb218-7e34-4aad-9e70-a8cb276b5b9f","Type":"ContainerDied","Data":"d19449797159210432a8880205dfd5fd4adc9cf554cb0010b1efcc89afc6f8ef"} Apr 16 21:04:50.872391 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:50.872027 2575 scope.go:117] "RemoveContainer" containerID="ee5ae4af7c249107ee70eea969bce468f8d65d8c012ca4e8115ce9f7561cba51" Apr 16 21:04:50.872391 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:50.872243 2575 scope.go:117] "RemoveContainer" containerID="d19449797159210432a8880205dfd5fd4adc9cf554cb0010b1efcc89afc6f8ef" Apr 16 21:04:50.872528 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:50.872418 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-9mpb6_opendatahub(900eb218-7e34-4aad-9e70-a8cb276b5b9f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" podUID="900eb218-7e34-4aad-9e70-a8cb276b5b9f" Apr 16 21:04:51.876619 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:51.876591 2575 scope.go:117] "RemoveContainer" containerID="d19449797159210432a8880205dfd5fd4adc9cf554cb0010b1efcc89afc6f8ef" Apr 16 21:04:51.876986 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:51.876758 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-9mpb6_opendatahub(900eb218-7e34-4aad-9e70-a8cb276b5b9f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" podUID="900eb218-7e34-4aad-9e70-a8cb276b5b9f" Apr 16 21:04:52.395603 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.395571 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-5khsz"] Apr 16 21:04:52.399703 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.399684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:52.404475 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.404452 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 21:04:52.404555 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.404508 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-6b4cs\"" Apr 16 21:04:52.427483 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.427452 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-5khsz"] Apr 16 21:04:52.507880 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.507841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6ef7701-a721-4236-977d-d94cb403e9b2-cert\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:52.508107 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.507898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkts8\" (UniqueName: \"kubernetes.io/projected/e6ef7701-a721-4236-977d-d94cb403e9b2-kube-api-access-nkts8\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:52.608846 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.608804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkts8\" (UniqueName: \"kubernetes.io/projected/e6ef7701-a721-4236-977d-d94cb403e9b2-kube-api-access-nkts8\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:52.609024 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.608959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e6ef7701-a721-4236-977d-d94cb403e9b2-cert\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:52.612410 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.612391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 21:04:52.620058 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:52.620041 2575 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 21:04:52.620152 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:52.620103 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6ef7701-a721-4236-977d-d94cb403e9b2-cert podName:e6ef7701-a721-4236-977d-d94cb403e9b2 nodeName:}" failed. No retries permitted until 2026-04-16 21:04:53.120082239 +0000 UTC m=+420.909699222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6ef7701-a721-4236-977d-d94cb403e9b2-cert") pod "kserve-controller-manager-856948b99f-5khsz" (UID: "e6ef7701-a721-4236-977d-d94cb403e9b2") : secret "kserve-webhook-server-cert" not found Apr 16 21:04:52.655571 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:52.655503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkts8\" (UniqueName: \"kubernetes.io/projected/e6ef7701-a721-4236-977d-d94cb403e9b2-kube-api-access-nkts8\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:53.213585 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.213547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6ef7701-a721-4236-977d-d94cb403e9b2-cert\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:53.215938 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.215914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6ef7701-a721-4236-977d-d94cb403e9b2-cert\") pod \"kserve-controller-manager-856948b99f-5khsz\" (UID: \"e6ef7701-a721-4236-977d-d94cb403e9b2\") " pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:53.312594 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.312561 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-6b4cs\"" Apr 16 21:04:53.319695 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.319654 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:53.449022 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.448956 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-5khsz"] Apr 16 21:04:53.452550 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:04:53.452516 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ef7701_a721_4236_977d_d94cb403e9b2.slice/crio-87d98b68152b48e560a8be213f99e2c80108858a3e6c134277cd6c5b09b11c26 WatchSource:0}: Error finding container 87d98b68152b48e560a8be213f99e2c80108858a3e6c134277cd6c5b09b11c26: Status 404 returned error can't find the container with id 87d98b68152b48e560a8be213f99e2c80108858a3e6c134277cd6c5b09b11c26 Apr 16 21:04:53.771624 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.771597 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-r9snz"] Apr 16 21:04:53.776165 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.776147 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:53.782314 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.782282 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 21:04:53.782476 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.782310 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 21:04:53.782476 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.782413 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-g85c5\"" Apr 16 21:04:53.808507 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.808473 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-r9snz"] Apr 16 21:04:53.883377 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.883340 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" event={"ID":"e6ef7701-a721-4236-977d-d94cb403e9b2","Type":"ContainerStarted","Data":"87d98b68152b48e560a8be213f99e2c80108858a3e6c134277cd6c5b09b11c26"} Apr 16 21:04:53.919686 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.919644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv6tz\" (UniqueName: \"kubernetes.io/projected/4fc514d5-e467-4d11-8aa3-83ce677ef5bc-kube-api-access-cv6tz\") pod \"servicemesh-operator3-55f49c5f94-r9snz\" (UID: \"4fc514d5-e467-4d11-8aa3-83ce677ef5bc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:53.919834 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:53.919705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4fc514d5-e467-4d11-8aa3-83ce677ef5bc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-r9snz\" (UID: \"4fc514d5-e467-4d11-8aa3-83ce677ef5bc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:54.021090 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.021051 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cv6tz\" (UniqueName: \"kubernetes.io/projected/4fc514d5-e467-4d11-8aa3-83ce677ef5bc-kube-api-access-cv6tz\") pod \"servicemesh-operator3-55f49c5f94-r9snz\" (UID: \"4fc514d5-e467-4d11-8aa3-83ce677ef5bc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:54.021287 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.021116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4fc514d5-e467-4d11-8aa3-83ce677ef5bc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-r9snz\" (UID: \"4fc514d5-e467-4d11-8aa3-83ce677ef5bc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:54.023620 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.023560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4fc514d5-e467-4d11-8aa3-83ce677ef5bc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-r9snz\" (UID: \"4fc514d5-e467-4d11-8aa3-83ce677ef5bc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:54.030611 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.030589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv6tz\" (UniqueName: \"kubernetes.io/projected/4fc514d5-e467-4d11-8aa3-83ce677ef5bc-kube-api-access-cv6tz\") pod \"servicemesh-operator3-55f49c5f94-r9snz\" (UID: \"4fc514d5-e467-4d11-8aa3-83ce677ef5bc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:54.085770 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.085741 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:54.209237 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.209216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-r9snz"] Apr 16 21:04:54.211425 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:04:54.211395 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fc514d5_e467_4d11_8aa3_83ce677ef5bc.slice/crio-31b9a2bceba7c8e13af1e12f97c14794bd894243ade0d55c5df1b6c4b197aa24 WatchSource:0}: Error finding container 31b9a2bceba7c8e13af1e12f97c14794bd894243ade0d55c5df1b6c4b197aa24: Status 404 returned error can't find the container with id 31b9a2bceba7c8e13af1e12f97c14794bd894243ade0d55c5df1b6c4b197aa24 Apr 16 21:04:54.889610 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:54.889542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" event={"ID":"4fc514d5-e467-4d11-8aa3-83ce677ef5bc","Type":"ContainerStarted","Data":"31b9a2bceba7c8e13af1e12f97c14794bd894243ade0d55c5df1b6c4b197aa24"} Apr 16 21:04:56.692508 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:56.692466 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:04:56.693001 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:56.692923 2575 scope.go:117] "RemoveContainer" containerID="d19449797159210432a8880205dfd5fd4adc9cf554cb0010b1efcc89afc6f8ef" Apr 16 21:04:56.693130 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:04:56.693105 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-9mpb6_opendatahub(900eb218-7e34-4aad-9e70-a8cb276b5b9f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" podUID="900eb218-7e34-4aad-9e70-a8cb276b5b9f" Apr 16 21:04:57.902478 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:57.902424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" event={"ID":"e6ef7701-a721-4236-977d-d94cb403e9b2","Type":"ContainerStarted","Data":"5cb37da7ce8d6b7bb73a38d65d005a93493b9a54f45e5b86cb08a8fbe2c75999"} Apr 16 21:04:57.902921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:57.902523 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:04:57.903900 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:57.903872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" event={"ID":"4fc514d5-e467-4d11-8aa3-83ce677ef5bc","Type":"ContainerStarted","Data":"f9922d9d26488600d0f4b21ab0e8e621136edbcaa730cf5958b9bfd6cf4e9e1b"} Apr 16 21:04:57.904030 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:57.904006 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:04:57.921841 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:57.921793 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" podStartSLOduration=2.3635005 podStartE2EDuration="5.921780251s" podCreationTimestamp="2026-04-16 21:04:52 +0000 UTC" firstStartedPulling="2026-04-16 21:04:53.453829244 +0000 UTC m=+421.243446227" lastFinishedPulling="2026-04-16 21:04:57.012108992 +0000 UTC m=+424.801725978" observedRunningTime="2026-04-16 21:04:57.920339697 +0000 UTC m=+425.709956703" watchObservedRunningTime="2026-04-16 21:04:57.921780251 +0000 UTC m=+425.711397255" Apr 16 21:04:57.942570 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:04:57.942513 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" podStartSLOduration=2.140287891 podStartE2EDuration="4.94249501s" podCreationTimestamp="2026-04-16 21:04:53 +0000 UTC" firstStartedPulling="2026-04-16 21:04:54.214012576 +0000 UTC m=+422.003629560" lastFinishedPulling="2026-04-16 21:04:57.016219693 +0000 UTC m=+424.805836679" observedRunningTime="2026-04-16 21:04:57.940378195 +0000 UTC m=+425.729995213" watchObservedRunningTime="2026-04-16 21:04:57.94249501 +0000 UTC m=+425.732112017" Apr 16 21:05:06.691948 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:06.691899 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:05:06.692338 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:06.692286 2575 scope.go:117] "RemoveContainer" containerID="d19449797159210432a8880205dfd5fd4adc9cf554cb0010b1efcc89afc6f8ef" Apr 16 21:05:07.936638 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:07.936597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" event={"ID":"900eb218-7e34-4aad-9e70-a8cb276b5b9f","Type":"ContainerStarted","Data":"1e68a875ecedb38b2b1317637e38681147119d58e5171ef8bfddee077d4ba504"} Apr 16 21:05:07.937134 ip-10-0-141-171 
kubenswrapper[2575]: I0416 21:05:07.936811 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:05:07.960015 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:07.959963 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" podStartSLOduration=4.025500305 podStartE2EDuration="23.959950718s" podCreationTimestamp="2026-04-16 21:04:44 +0000 UTC" firstStartedPulling="2026-04-16 21:04:47.025122624 +0000 UTC m=+414.814739607" lastFinishedPulling="2026-04-16 21:05:06.959573023 +0000 UTC m=+434.749190020" observedRunningTime="2026-04-16 21:05:07.958951769 +0000 UTC m=+435.748568786" watchObservedRunningTime="2026-04-16 21:05:07.959950718 +0000 UTC m=+435.749567751" Apr 16 21:05:08.909525 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:08.909493 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-r9snz" Apr 16 21:05:18.448058 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.448026 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs"] Apr 16 21:05:18.455217 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.455190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.458218 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.458196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 21:05:18.458218 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.458209 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 21:05:18.458395 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.458237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 21:05:18.458474 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.458458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 21:05:18.458521 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.458461 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-z4hvv\"" Apr 16 21:05:18.463269 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.463246 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs"] Apr 16 21:05:18.514489 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.514489 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/34b44077-1c57-46bf-a8ce-9eb591c5f352-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: 
\"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.514675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62f4z\" (UniqueName: \"kubernetes.io/projected/34b44077-1c57-46bf-a8ce-9eb591c5f352-kube-api-access-62f4z\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.514675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.514675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.514675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.514675 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.514644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615548 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615762 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615762 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615762 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/34b44077-1c57-46bf-a8ce-9eb591c5f352-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615762 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62f4z\" (UniqueName: \"kubernetes.io/projected/34b44077-1c57-46bf-a8ce-9eb591c5f352-kube-api-access-62f4z\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615762 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.615762 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.615728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.616490 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.616407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.618069 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.618044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/34b44077-1c57-46bf-a8ce-9eb591c5f352-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.618188 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.618102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.618188 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.618112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.618304 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.618253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.624085 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.624059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/34b44077-1c57-46bf-a8ce-9eb591c5f352-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.625115 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.625094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62f4z\" (UniqueName: \"kubernetes.io/projected/34b44077-1c57-46bf-a8ce-9eb591c5f352-kube-api-access-62f4z\") pod \"istiod-openshift-gateway-55ff986f96-vv8xs\" (UID: \"34b44077-1c57-46bf-a8ce-9eb591c5f352\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.765079 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.764987 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:18.897483 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.896942 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs"] Apr 16 21:05:18.905975 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:05:18.905942 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b44077_1c57_46bf_a8ce_9eb591c5f352.slice/crio-cba6dadaf3ca913eea1340c589bd9e57336484b1dd17f2251aad7ef2cdd8e5c1 WatchSource:0}: Error finding container cba6dadaf3ca913eea1340c589bd9e57336484b1dd17f2251aad7ef2cdd8e5c1: Status 404 returned error can't find the container with id cba6dadaf3ca913eea1340c589bd9e57336484b1dd17f2251aad7ef2cdd8e5c1 Apr 16 21:05:18.942874 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.942843 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-9mpb6" Apr 16 21:05:18.970125 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:18.970084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" event={"ID":"34b44077-1c57-46bf-a8ce-9eb591c5f352","Type":"ContainerStarted","Data":"cba6dadaf3ca913eea1340c589bd9e57336484b1dd17f2251aad7ef2cdd8e5c1"} Apr 16 21:05:21.388449 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:21.388396 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 21:05:21.388706 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:21.388494 2575 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 21:05:21.986053 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:21.986015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" event={"ID":"34b44077-1c57-46bf-a8ce-9eb591c5f352","Type":"ContainerStarted","Data":"1298317cec2ce8c6a0afd3d0802b099d5146f8feba3927babd928a4c8940b7e3"} Apr 16 21:05:21.986231 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:21.986161 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:22.008847 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:22.008798 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" podStartSLOduration=1.529241844 podStartE2EDuration="4.008783526s" podCreationTimestamp="2026-04-16 21:05:18 +0000 UTC" firstStartedPulling="2026-04-16 21:05:18.908567787 +0000 UTC m=+446.698184774" lastFinishedPulling="2026-04-16 21:05:21.388109461 +0000 UTC m=+449.177726456" observedRunningTime="2026-04-16 21:05:22.007123457 +0000 UTC m=+449.796740475" watchObservedRunningTime="2026-04-16 21:05:22.008783526 +0000 UTC m=+449.798400533" Apr 16 21:05:22.992546 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:22.992514 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv8xs" Apr 16 21:05:28.912705 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:05:28.912670 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-5khsz" Apr 16 21:06:22.196890 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.196854 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95"] Apr 16 21:06:22.200055 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.200037 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.202960 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.202934 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ldzvv\"" Apr 16 21:06:22.203057 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.202984 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 21:06:22.203671 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.203644 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 21:06:22.216477 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.216427 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95"] Apr 16 21:06:22.303844 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.303806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.303844 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.303845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb95j\" (UniqueName: \"kubernetes.io/projected/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-kube-api-access-qb95j\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.404966 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.404929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.404966 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.404965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb95j\" (UniqueName: \"kubernetes.io/projected/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-kube-api-access-qb95j\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.405360 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.405335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.415648 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.415624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qb95j\" (UniqueName: \"kubernetes.io/projected/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-kube-api-access-qb95j\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.510359 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.510267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:22.636123 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:22.636070 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95"] Apr 16 21:06:22.638341 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:06:22.638315 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod013fbf1d_9e6a_4b37_918a_546e3a10cd1f.slice/crio-991d5281e318e80b2aa08c458f31a7af65d117dcb99c294e339345e67c1096c6 WatchSource:0}: Error finding container 991d5281e318e80b2aa08c458f31a7af65d117dcb99c294e339345e67c1096c6: Status 404 returned error can't find the container with id 991d5281e318e80b2aa08c458f31a7af65d117dcb99c294e339345e67c1096c6 Apr 16 21:06:23.173146 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:23.173092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" event={"ID":"013fbf1d-9e6a-4b37-918a-546e3a10cd1f","Type":"ContainerStarted","Data":"991d5281e318e80b2aa08c458f31a7af65d117dcb99c294e339345e67c1096c6"} Apr 16 21:06:29.195175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:29.195137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" event={"ID":"013fbf1d-9e6a-4b37-918a-546e3a10cd1f","Type":"ContainerStarted","Data":"40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446"} Apr 16 21:06:29.195590 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:29.195195 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:29.219702 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:29.219653 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" podStartSLOduration=1.116939188 podStartE2EDuration="7.219638414s" podCreationTimestamp="2026-04-16 21:06:22 +0000 UTC" firstStartedPulling="2026-04-16 21:06:22.640625369 +0000 UTC m=+510.430242365" lastFinishedPulling="2026-04-16 21:06:28.743324605 +0000 UTC m=+516.532941591" observedRunningTime="2026-04-16 21:06:29.217390748 +0000 UTC m=+517.007007755" watchObservedRunningTime="2026-04-16 21:06:29.219638414 +0000 UTC m=+517.009255418" Apr 16 21:06:40.201094 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:40.201017 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:41.330960 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.330926 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z"] Apr 16 21:06:41.334080 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.334058 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.355009 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.354978 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z"] Apr 16 21:06:41.440542 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.440501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kldzh\" (UniqueName: \"kubernetes.io/projected/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-kube-api-access-kldzh\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.440710 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.440593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.443286 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.443256 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z"] Apr 16 21:06:41.443480 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:06:41.443459 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-kldzh], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" Apr 16 21:06:41.541397 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.541363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kldzh\" (UniqueName: \"kubernetes.io/projected/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-kube-api-access-kldzh\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.541591 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.541479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.541815 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.541798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.561970 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.561934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kldzh\" 
(UniqueName: \"kubernetes.io/projected/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-kube-api-access-kldzh\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:41.890909 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.890873 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95"] Apr 16 21:06:41.891110 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.891084 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" containerName="manager" containerID="cri-o://40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446" gracePeriod=2 Apr 16 21:06:41.902333 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.902307 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95"] Apr 16 21:06:41.911576 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.911550 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z"] Apr 16 21:06:41.920663 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.920632 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn"] Apr 16 21:06:41.921065 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.921041 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" containerName="manager" Apr 16 21:06:41.921065 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.921065 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" containerName="manager" Apr 16 21:06:41.921211 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.921176 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" containerName="manager" Apr 16 21:06:41.924102 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.924080 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z"] Apr 16 21:06:41.924186 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.924181 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:41.934342 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.934315 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn"] Apr 16 21:06:41.945009 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.944980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgck\" (UniqueName: \"kubernetes.io/projected/0624e254-5027-4211-b0f0-ddfcd127106a-kube-api-access-vxgck\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cz6xn\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:41.945142 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.945017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0624e254-5027-4211-b0f0-ddfcd127106a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cz6xn\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:41.989411 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.989375 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv"] Apr 16 21:06:41.992570 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:41.992548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.011165 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.011138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv"] Apr 16 21:06:42.046068 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.046032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgck\" (UniqueName: \"kubernetes.io/projected/0624e254-5027-4211-b0f0-ddfcd127106a-kube-api-access-vxgck\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cz6xn\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:42.046245 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.046073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0624e254-5027-4211-b0f0-ddfcd127106a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cz6xn\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:42.046245 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.046109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/690ff4b2-c170-4503-8d07-340ff416ef06-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hdwsv\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.046245 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.046161 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvmc\" (UniqueName: \"kubernetes.io/projected/690ff4b2-c170-4503-8d07-340ff416ef06-kube-api-access-xbvmc\") pod \"kuadrant-operator-controller-manager-84b657d985-hdwsv\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.046569 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.046544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0624e254-5027-4211-b0f0-ddfcd127106a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cz6xn\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:42.065971 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.065940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgck\" (UniqueName: \"kubernetes.io/projected/0624e254-5027-4211-b0f0-ddfcd127106a-kube-api-access-vxgck\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-cz6xn\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:42.119250 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.119227 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:42.122332 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.122307 2575 status_manager.go:895] "Failed to get status for pod" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.146761 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.146687 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-extensions-socket-volume\") pod \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " Apr 16 21:06:42.146761 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.146735 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb95j\" (UniqueName: \"kubernetes.io/projected/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-kube-api-access-qb95j\") pod \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\" (UID: \"013fbf1d-9e6a-4b37-918a-546e3a10cd1f\") " Apr 16 21:06:42.146951 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.146824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvmc\" (UniqueName: \"kubernetes.io/projected/690ff4b2-c170-4503-8d07-340ff416ef06-kube-api-access-xbvmc\") pod \"kuadrant-operator-controller-manager-84b657d985-hdwsv\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.146951 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.146879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/690ff4b2-c170-4503-8d07-340ff416ef06-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hdwsv\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.147226 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.147186 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "013fbf1d-9e6a-4b37-918a-546e3a10cd1f" (UID: "013fbf1d-9e6a-4b37-918a-546e3a10cd1f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:42.147226 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.147203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/690ff4b2-c170-4503-8d07-340ff416ef06-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hdwsv\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.148970 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.148946 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-kube-api-access-qb95j" (OuterVolumeSpecName: "kube-api-access-qb95j") pod "013fbf1d-9e6a-4b37-918a-546e3a10cd1f" (UID: "013fbf1d-9e6a-4b37-918a-546e3a10cd1f"). InnerVolumeSpecName "kube-api-access-qb95j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:42.174099 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.174062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvmc\" (UniqueName: \"kubernetes.io/projected/690ff4b2-c170-4503-8d07-340ff416ef06-kube-api-access-xbvmc\") pod \"kuadrant-operator-controller-manager-84b657d985-hdwsv\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.237891 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.237857 2575 generic.go:358] "Generic (PLEG): container finished" podID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" containerID="40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446" exitCode=0 Apr 16 21:06:42.238053 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.237899 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" Apr 16 21:06:42.238053 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.237952 2575 scope.go:117] "RemoveContainer" containerID="40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446" Apr 16 21:06:42.238158 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.238121 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:42.241048 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.241004 2575 status_manager.go:895] "Failed to get status for pod" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.243448 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.243416 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:42.243817 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.243794 2575 status_manager.go:895] "Failed to get status for pod" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.246319 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.246281 2575 status_manager.go:895] "Failed to get status for pod" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.248066 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.248045 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qb95j\" (UniqueName: \"kubernetes.io/projected/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-kube-api-access-qb95j\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:06:42.248066 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.248067 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/013fbf1d-9e6a-4b37-918a-546e3a10cd1f-extensions-socket-volume\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:06:42.248473 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.248448 2575 scope.go:117] "RemoveContainer" containerID="40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446" Apr 16 21:06:42.248727 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:06:42.248707 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446\": container with ID starting with 40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446 not found: ID does not exist" containerID="40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446" Apr 16 21:06:42.248792 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.248733 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446"} err="failed to get container status \"40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446\": rpc error: code = NotFound desc = could not find container \"40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446\": container with ID starting with 40774251735d364e891fb75a428c5876c65a907f6c5639fde2041dcce1322446 not found: ID does not exist" Apr 16 21:06:42.248858 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.248824 2575 status_manager.go:895] "Failed to get status for pod" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.251414 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.251389 2575 status_manager.go:895] "Failed to get status for pod" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.253831 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.253798 2575 status_manager.go:895] "Failed to get status for pod" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.256240 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.256219 2575 status_manager.go:895] "Failed to get status for pod" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.268980 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.268959 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:42.302710 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.302665 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:42.348600 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.348370 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-extensions-socket-volume\") pod \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " Apr 16 21:06:42.348600 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.348503 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kldzh\" (UniqueName: \"kubernetes.io/projected/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-kube-api-access-kldzh\") pod \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\" (UID: \"ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc\") " Apr 16 21:06:42.349004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.348698 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" (UID: "ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:42.350958 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.350922 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-kube-api-access-kldzh" (OuterVolumeSpecName: "kube-api-access-kldzh") pod "ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" (UID: "ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc"). InnerVolumeSpecName "kube-api-access-kldzh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:42.412305 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.411998 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn"] Apr 16 21:06:42.414975 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:06:42.414939 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0624e254_5027_4211_b0f0_ddfcd127106a.slice/crio-01c1bd7cc0e101eccd017bc48f50ed08e8454aa59f29b616819817a301511dab WatchSource:0}: Error finding container 01c1bd7cc0e101eccd017bc48f50ed08e8454aa59f29b616819817a301511dab: Status 404 returned error can't find the container with id 01c1bd7cc0e101eccd017bc48f50ed08e8454aa59f29b616819817a301511dab Apr 16 21:06:42.449391 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.449368 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kldzh\" (UniqueName: \"kubernetes.io/projected/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-kube-api-access-kldzh\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:06:42.449391 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.449393 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc-extensions-socket-volume\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:06:42.462751 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.462729 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv"] Apr 16 21:06:42.465361 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:06:42.465337 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod690ff4b2_c170_4503_8d07_340ff416ef06.slice/crio-033f9731a9d73ca6555de02b017b2d92fd94f44ceb692a74a3014943bfe459ed WatchSource:0}: Error finding container 033f9731a9d73ca6555de02b017b2d92fd94f44ceb692a74a3014943bfe459ed: Status 404 returned error can't find the container with id 033f9731a9d73ca6555de02b017b2d92fd94f44ceb692a74a3014943bfe459ed Apr 16 21:06:42.720199 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.720116 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" path="/var/lib/kubelet/pods/013fbf1d-9e6a-4b37-918a-546e3a10cd1f/volumes" Apr 16 21:06:42.720515 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.720493 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" path="/var/lib/kubelet/pods/ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc/volumes" Apr 16 21:06:42.772198 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.772156 2575 status_manager.go:895] "Failed to get status for pod" podUID="013fbf1d-9e6a-4b37-918a-546e3a10cd1f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nks95" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nks95\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:42.774494 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:42.774429 2575 status_manager.go:895] "Failed to get status for pod" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:43.242777 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.242740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" event={"ID":"690ff4b2-c170-4503-8d07-340ff416ef06","Type":"ContainerStarted","Data":"6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483"} Apr 16 21:06:43.242777 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.242780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" event={"ID":"690ff4b2-c170-4503-8d07-340ff416ef06","Type":"ContainerStarted","Data":"033f9731a9d73ca6555de02b017b2d92fd94f44ceb692a74a3014943bfe459ed"} Apr 16 21:06:43.243004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.242838 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:43.244184 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.244158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" event={"ID":"0624e254-5027-4211-b0f0-ddfcd127106a","Type":"ContainerStarted","Data":"069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce"} Apr 16 21:06:43.244184 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.244182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" event={"ID":"0624e254-5027-4211-b0f0-ddfcd127106a","Type":"ContainerStarted","Data":"01c1bd7cc0e101eccd017bc48f50ed08e8454aa59f29b616819817a301511dab"} Apr 16 21:06:43.244359 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.244267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:43.245001 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.244984 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" Apr 16 21:06:43.268058 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.268011 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" podStartSLOduration=2.267998148 podStartE2EDuration="2.267998148s" podCreationTimestamp="2026-04-16 21:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:06:43.265732223 +0000 UTC m=+531.055349256" watchObservedRunningTime="2026-04-16 21:06:43.267998148 +0000 UTC m=+531.057615154" Apr 16 21:06:43.268355 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.268333 2575 status_manager.go:895] "Failed to get status for pod" podUID="ec15c1d1-6a5b-4efb-8e99-2fbde94e90cc" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-h8g8z\" is forbidden: User \"system:node:ip-10-0-141-171.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-171.ec2.internal' and this object" Apr 16 21:06:43.294015 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:43.293967 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" podStartSLOduration=2.293950592 podStartE2EDuration="2.293950592s" podCreationTimestamp="2026-04-16 21:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:06:43.292562907 +0000 UTC m=+531.082179911" watchObservedRunningTime="2026-04-16 21:06:43.293950592 +0000 UTC m=+531.083567597" Apr 16 21:06:54.250429 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.250396 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:54.250943 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.250508 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:06:54.337861 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.337825 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn"] Apr 16 21:06:54.338049 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.338028 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" podUID="0624e254-5027-4211-b0f0-ddfcd127106a" containerName="manager" containerID="cri-o://069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce" gracePeriod=10 Apr 16 21:06:54.577328 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.577307 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:54.645405 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.645374 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxgck\" (UniqueName: \"kubernetes.io/projected/0624e254-5027-4211-b0f0-ddfcd127106a-kube-api-access-vxgck\") pod \"0624e254-5027-4211-b0f0-ddfcd127106a\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " Apr 16 21:06:54.645591 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.645461 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0624e254-5027-4211-b0f0-ddfcd127106a-extensions-socket-volume\") pod \"0624e254-5027-4211-b0f0-ddfcd127106a\" (UID: \"0624e254-5027-4211-b0f0-ddfcd127106a\") " Apr 16 21:06:54.645839 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.645817 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0624e254-5027-4211-b0f0-ddfcd127106a-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0624e254-5027-4211-b0f0-ddfcd127106a" (UID: "0624e254-5027-4211-b0f0-ddfcd127106a"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:06:54.647452 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.647413 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0624e254-5027-4211-b0f0-ddfcd127106a-kube-api-access-vxgck" (OuterVolumeSpecName: "kube-api-access-vxgck") pod "0624e254-5027-4211-b0f0-ddfcd127106a" (UID: "0624e254-5027-4211-b0f0-ddfcd127106a"). InnerVolumeSpecName "kube-api-access-vxgck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:06:54.746985 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.746956 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0624e254-5027-4211-b0f0-ddfcd127106a-extensions-socket-volume\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:06:54.746985 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.746981 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxgck\" (UniqueName: \"kubernetes.io/projected/0624e254-5027-4211-b0f0-ddfcd127106a-kube-api-access-vxgck\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:06:54.777633 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.777562 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn"] Apr 16 21:06:54.777964 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.777949 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0624e254-5027-4211-b0f0-ddfcd127106a" containerName="manager" Apr 16 21:06:54.778026 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.777966 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0624e254-5027-4211-b0f0-ddfcd127106a" containerName="manager" Apr 16 21:06:54.778069 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.778033 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0624e254-5027-4211-b0f0-ddfcd127106a" containerName="manager" Apr 16 21:06:54.789024 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.788990 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:54.791598 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.791572 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn"] Apr 16 21:06:54.847570 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.847525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmttr\" (UniqueName: \"kubernetes.io/projected/e15f0b1a-9607-4abd-a1bf-04eb18545b11-kube-api-access-pmttr\") pod \"kuadrant-operator-controller-manager-55c7f4c975-z8frn\" (UID: \"e15f0b1a-9607-4abd-a1bf-04eb18545b11\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:54.847748 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.847581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e15f0b1a-9607-4abd-a1bf-04eb18545b11-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-z8frn\" (UID: \"e15f0b1a-9607-4abd-a1bf-04eb18545b11\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:54.948862 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.948818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e15f0b1a-9607-4abd-a1bf-04eb18545b11-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-z8frn\" (UID: \"e15f0b1a-9607-4abd-a1bf-04eb18545b11\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:54.948986 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.948913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmttr\" (UniqueName: \"kubernetes.io/projected/e15f0b1a-9607-4abd-a1bf-04eb18545b11-kube-api-access-pmttr\") pod \"kuadrant-operator-controller-manager-55c7f4c975-z8frn\" (UID: \"e15f0b1a-9607-4abd-a1bf-04eb18545b11\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:54.949217 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.949196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e15f0b1a-9607-4abd-a1bf-04eb18545b11-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-z8frn\" (UID: \"e15f0b1a-9607-4abd-a1bf-04eb18545b11\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:54.958316 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:54.958297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmttr\" (UniqueName: \"kubernetes.io/projected/e15f0b1a-9607-4abd-a1bf-04eb18545b11-kube-api-access-pmttr\") pod \"kuadrant-operator-controller-manager-55c7f4c975-z8frn\" (UID: \"e15f0b1a-9607-4abd-a1bf-04eb18545b11\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:55.098804 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.098717 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:55.229504 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.229471 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn"] Apr 16 21:06:55.231909 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:06:55.231875 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode15f0b1a_9607_4abd_a1bf_04eb18545b11.slice/crio-32862dcff562589e7ccbd8a7e8882ffb92db81c7514a791861f8d3c3822dbef7 WatchSource:0}: Error finding container 32862dcff562589e7ccbd8a7e8882ffb92db81c7514a791861f8d3c3822dbef7: Status 404 returned error can't find the container with id 32862dcff562589e7ccbd8a7e8882ffb92db81c7514a791861f8d3c3822dbef7 Apr 16 21:06:55.282798 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.282768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" event={"ID":"e15f0b1a-9607-4abd-a1bf-04eb18545b11","Type":"ContainerStarted","Data":"32862dcff562589e7ccbd8a7e8882ffb92db81c7514a791861f8d3c3822dbef7"} Apr 16 21:06:55.283855 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.283831 2575 generic.go:358] "Generic (PLEG): container finished" podID="0624e254-5027-4211-b0f0-ddfcd127106a" containerID="069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce" exitCode=0 Apr 16 21:06:55.283929 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.283888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" event={"ID":"0624e254-5027-4211-b0f0-ddfcd127106a","Type":"ContainerDied","Data":"069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce"} Apr 16 21:06:55.283929 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.283918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" event={"ID":"0624e254-5027-4211-b0f0-ddfcd127106a","Type":"ContainerDied","Data":"01c1bd7cc0e101eccd017bc48f50ed08e8454aa59f29b616819817a301511dab"} Apr 16 21:06:55.283991 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.283927 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn" Apr 16 21:06:55.283991 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.283937 2575 scope.go:117] "RemoveContainer" containerID="069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce" Apr 16 21:06:55.292057 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.292034 2575 scope.go:117] "RemoveContainer" containerID="069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce" Apr 16 21:06:55.292350 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:06:55.292330 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce\": container with ID starting with 069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce not found: ID does not exist" containerID="069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce" Apr 16 21:06:55.292417 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.292361 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce"} err="failed to get container status \"069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce\": rpc error: code = NotFound desc = could not find container \"069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce\": container with ID starting with 069c7cd4b4e720146ca35df70ea329940f69505619b4edb2e2f6ce15919f74ce not found: ID does not exist" Apr 16 21:06:55.307541 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.307517 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn"] Apr 16 21:06:55.323165 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:55.323142 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-cz6xn"] Apr 16 21:06:56.289062 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:56.289026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" event={"ID":"e15f0b1a-9607-4abd-a1bf-04eb18545b11","Type":"ContainerStarted","Data":"2fc6651666b633740fd3dd8497f39b2dc8f9ed18b37923da7e7211bd4e38584b"} Apr 16 21:06:56.289529 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:56.289141 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:06:56.313248 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:56.313196 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" podStartSLOduration=2.313184449 podStartE2EDuration="2.313184449s" podCreationTimestamp="2026-04-16 21:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:06:56.311735602 +0000 UTC m=+544.101352608" watchObservedRunningTime="2026-04-16 21:06:56.313184449 +0000 UTC m=+544.102801453" Apr 16 21:06:56.719070 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:06:56.719035 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0624e254-5027-4211-b0f0-ddfcd127106a" path="/var/lib/kubelet/pods/0624e254-5027-4211-b0f0-ddfcd127106a/volumes" Apr 16 21:07:07.295218 ip-10-0-141-171 
kubenswrapper[2575]: I0416 21:07:07.295188 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-z8frn" Apr 16 21:07:07.348663 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.348634 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv"] Apr 16 21:07:07.348892 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.348868 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" podUID="690ff4b2-c170-4503-8d07-340ff416ef06" containerName="manager" containerID="cri-o://6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483" gracePeriod=10 Apr 16 21:07:07.585894 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.585870 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:07:07.644492 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.644457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/690ff4b2-c170-4503-8d07-340ff416ef06-extensions-socket-volume\") pod \"690ff4b2-c170-4503-8d07-340ff416ef06\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " Apr 16 21:07:07.644671 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.644520 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbvmc\" (UniqueName: \"kubernetes.io/projected/690ff4b2-c170-4503-8d07-340ff416ef06-kube-api-access-xbvmc\") pod \"690ff4b2-c170-4503-8d07-340ff416ef06\" (UID: \"690ff4b2-c170-4503-8d07-340ff416ef06\") " Apr 16 21:07:07.644901 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.644875 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/690ff4b2-c170-4503-8d07-340ff416ef06-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "690ff4b2-c170-4503-8d07-340ff416ef06" (UID: "690ff4b2-c170-4503-8d07-340ff416ef06"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:07:07.646538 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.646517 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690ff4b2-c170-4503-8d07-340ff416ef06-kube-api-access-xbvmc" (OuterVolumeSpecName: "kube-api-access-xbvmc") pod "690ff4b2-c170-4503-8d07-340ff416ef06" (UID: "690ff4b2-c170-4503-8d07-340ff416ef06"). InnerVolumeSpecName "kube-api-access-xbvmc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:07:07.745705 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.745673 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/690ff4b2-c170-4503-8d07-340ff416ef06-extensions-socket-volume\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:07:07.745705 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:07.745700 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbvmc\" (UniqueName: \"kubernetes.io/projected/690ff4b2-c170-4503-8d07-340ff416ef06-kube-api-access-xbvmc\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\"" Apr 16 21:07:08.327081 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.327043 2575 generic.go:358] "Generic (PLEG): container finished" podID="690ff4b2-c170-4503-8d07-340ff416ef06" containerID="6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483" exitCode=0 Apr 16 21:07:08.327554 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.327133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" event={"ID":"690ff4b2-c170-4503-8d07-340ff416ef06","Type":"ContainerDied","Data":"6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483"} Apr 16 21:07:08.327554 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.327173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" event={"ID":"690ff4b2-c170-4503-8d07-340ff416ef06","Type":"ContainerDied","Data":"033f9731a9d73ca6555de02b017b2d92fd94f44ceb692a74a3014943bfe459ed"} Apr 16 21:07:08.327554 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.327188 2575 scope.go:117] "RemoveContainer" containerID="6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483" Apr 16 21:07:08.327554 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.327145 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv" Apr 16 21:07:08.336684 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.336541 2575 scope.go:117] "RemoveContainer" containerID="6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483" Apr 16 21:07:08.337030 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:07:08.337007 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483\": container with ID starting with 6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483 not found: ID does not exist" containerID="6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483" Apr 16 21:07:08.337089 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.337041 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483"} err="failed to get container status \"6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483\": rpc error: code = NotFound desc = could not find container \"6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483\": container with ID starting with 6d8645dcd6e63a11454a0a474dbd776fe2d24b9c470da390df257142ce6d6483 not found: ID does not exist" Apr 16 21:07:08.350995 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.350970 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv"] Apr 16 21:07:08.357900 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.357877 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hdwsv"] Apr 16 21:07:08.719281 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:08.719251 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690ff4b2-c170-4503-8d07-340ff416ef06" path="/var/lib/kubelet/pods/690ff4b2-c170-4503-8d07-340ff416ef06/volumes" Apr 16 21:07:26.666004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.665966 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"] Apr 16 21:07:26.666475 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.666259 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="690ff4b2-c170-4503-8d07-340ff416ef06" containerName="manager" Apr 16 21:07:26.666475 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.666270 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="690ff4b2-c170-4503-8d07-340ff416ef06" containerName="manager" Apr 16 21:07:26.666475 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.666317 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="690ff4b2-c170-4503-8d07-340ff416ef06" containerName="manager" Apr 16 21:07:26.669133 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.669115 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.672062 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.672037 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 21:07:26.672197 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.672179 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ljsfg\"" Apr 16 21:07:26.677306 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.677281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"] Apr 16 21:07:26.761849 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.761815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"] Apr 16 21:07:26.793305 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.793269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjjs\" (UniqueName: \"kubernetes.io/projected/d1608b86-1993-4340-92cb-06f480234e95-kube-api-access-sdjjs\") pod \"limitador-limitador-7d549b5b-q7d5f\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.793485 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.793321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1608b86-1993-4340-92cb-06f480234e95-config-file\") pod \"limitador-limitador-7d549b5b-q7d5f\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.894125 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.894088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjjs\" (UniqueName: \"kubernetes.io/projected/d1608b86-1993-4340-92cb-06f480234e95-kube-api-access-sdjjs\") pod \"limitador-limitador-7d549b5b-q7d5f\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.894125 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.894127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1608b86-1993-4340-92cb-06f480234e95-config-file\") pod \"limitador-limitador-7d549b5b-q7d5f\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.894800 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.894777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1608b86-1993-4340-92cb-06f480234e95-config-file\") pod \"limitador-limitador-7d549b5b-q7d5f\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.902031 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:26.902008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjjs\" (UniqueName: \"kubernetes.io/projected/d1608b86-1993-4340-92cb-06f480234e95-kube-api-access-sdjjs\") pod \"limitador-limitador-7d549b5b-q7d5f\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") " pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:26.979336 ip-10-0-141-171 
kubenswrapper[2575]: I0416 21:07:26.979236 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:27.107415 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:27.107389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"] Apr 16 21:07:27.110000 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:07:27.109971 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1608b86_1993_4340_92cb_06f480234e95.slice/crio-696d43e6a3c3d0ce903f15abb4d6495008dca74deddd413e3f5afad53a006bc3 WatchSource:0}: Error finding container 696d43e6a3c3d0ce903f15abb4d6495008dca74deddd413e3f5afad53a006bc3: Status 404 returned error can't find the container with id 696d43e6a3c3d0ce903f15abb4d6495008dca74deddd413e3f5afad53a006bc3 Apr 16 21:07:27.388712 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:27.388621 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" event={"ID":"d1608b86-1993-4340-92cb-06f480234e95","Type":"ContainerStarted","Data":"696d43e6a3c3d0ce903f15abb4d6495008dca74deddd413e3f5afad53a006bc3"} Apr 16 21:07:30.399078 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:30.399044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" event={"ID":"d1608b86-1993-4340-92cb-06f480234e95","Type":"ContainerStarted","Data":"2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f"} Apr 16 21:07:30.399465 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:30.399212 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:30.417903 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:30.417858 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" podStartSLOduration=1.783118699 podStartE2EDuration="4.417844125s" podCreationTimestamp="2026-04-16 21:07:26 +0000 UTC" firstStartedPulling="2026-04-16 21:07:27.111850891 +0000 UTC m=+574.901467879" lastFinishedPulling="2026-04-16 21:07:29.746576309 +0000 UTC m=+577.536193305" observedRunningTime="2026-04-16 21:07:30.415961656 +0000 UTC m=+578.205578653" watchObservedRunningTime="2026-04-16 21:07:30.417844125 +0000 UTC m=+578.207461129" Apr 16 21:07:41.290266 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.290232 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"] Apr 16 21:07:41.290770 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.290493 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" podUID="d1608b86-1993-4340-92cb-06f480234e95" containerName="limitador" containerID="cri-o://2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f" gracePeriod=30 Apr 16 21:07:41.291332 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.291294 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" Apr 16 21:07:41.826726 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.826703 2575 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 21:07:41.923366 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.923338 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1608b86-1993-4340-92cb-06f480234e95-config-file\") pod \"d1608b86-1993-4340-92cb-06f480234e95\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") "
Apr 16 21:07:41.923560 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.923378 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdjjs\" (UniqueName: \"kubernetes.io/projected/d1608b86-1993-4340-92cb-06f480234e95-kube-api-access-sdjjs\") pod \"d1608b86-1993-4340-92cb-06f480234e95\" (UID: \"d1608b86-1993-4340-92cb-06f480234e95\") "
Apr 16 21:07:41.923738 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.923713 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1608b86-1993-4340-92cb-06f480234e95-config-file" (OuterVolumeSpecName: "config-file") pod "d1608b86-1993-4340-92cb-06f480234e95" (UID: "d1608b86-1993-4340-92cb-06f480234e95"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 21:07:41.925474 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:41.925418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1608b86-1993-4340-92cb-06f480234e95-kube-api-access-sdjjs" (OuterVolumeSpecName: "kube-api-access-sdjjs") pod "d1608b86-1993-4340-92cb-06f480234e95" (UID: "d1608b86-1993-4340-92cb-06f480234e95"). InnerVolumeSpecName "kube-api-access-sdjjs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 21:07:42.024769 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.024728 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1608b86-1993-4340-92cb-06f480234e95-config-file\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\""
Apr 16 21:07:42.024769 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.024759 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdjjs\" (UniqueName: \"kubernetes.io/projected/d1608b86-1993-4340-92cb-06f480234e95-kube-api-access-sdjjs\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\""
Apr 16 21:07:42.440179 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.440143 2575 generic.go:358] "Generic (PLEG): container finished" podID="d1608b86-1993-4340-92cb-06f480234e95" containerID="2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f" exitCode=0
Apr 16 21:07:42.440659 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.440184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" event={"ID":"d1608b86-1993-4340-92cb-06f480234e95","Type":"ContainerDied","Data":"2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f"}
Apr 16 21:07:42.440659 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.440207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f" event={"ID":"d1608b86-1993-4340-92cb-06f480234e95","Type":"ContainerDied","Data":"696d43e6a3c3d0ce903f15abb4d6495008dca74deddd413e3f5afad53a006bc3"}
Apr 16 21:07:42.440659 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.440205 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-q7d5f"
Apr 16 21:07:42.440659 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.440220 2575 scope.go:117] "RemoveContainer" containerID="2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f"
Apr 16 21:07:42.448490 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.448471 2575 scope.go:117] "RemoveContainer" containerID="2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f"
Apr 16 21:07:42.448740 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:07:42.448716 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f\": container with ID starting with 2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f not found: ID does not exist" containerID="2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f"
Apr 16 21:07:42.448828 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.448746 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f"} err="failed to get container status \"2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f\": rpc error: code = NotFound desc = could not find container \"2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f\": container with ID starting with 2d4ecc88fa27a3e6d5577267419060e0d16adf59c0c143af0445ce663b3ee48f not found: ID does not exist"
Apr 16 21:07:42.462598 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.462577 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"]
Apr 16 21:07:42.473765 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.473745 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-q7d5f"]
Apr 16 21:07:42.719580 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:42.719504 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1608b86-1993-4340-92cb-06f480234e95" path="/var/lib/kubelet/pods/d1608b86-1993-4340-92cb-06f480234e95/volumes"
Apr 16 21:07:46.320808 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.320778 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-tpg9j"]
Apr 16 21:07:46.321219 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.321107 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1608b86-1993-4340-92cb-06f480234e95" containerName="limitador"
Apr 16 21:07:46.321219 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.321123 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1608b86-1993-4340-92cb-06f480234e95" containerName="limitador"
Apr 16 21:07:46.321219 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.321187 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1608b86-1993-4340-92cb-06f480234e95" containerName="limitador"
Apr 16 21:07:46.325551 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.325530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-tpg9j"
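Each teardown ends with kubelet_volumes.go:163 "Cleaned up orphaned pod volumes dir" once the pod has been REMOVEd and nothing remains under /var/lib/kubelet/pods/<uid>/volumes. A rough sketch of the kind of scan involved, with a hypothetical helper (the real kubelet checks mounts and subpaths before deleting anything):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// orphanedVolumeDirs lists pods/<uid>/volumes directories whose pod UID
// is no longer known to the kubelet; `active` would come from the pod
// manager in a real implementation.
func orphanedVolumeDirs(kubeletRoot string, active map[string]bool) ([]string, error) {
	entries, err := os.ReadDir(filepath.Join(kubeletRoot, "pods"))
	if err != nil {
		return nil, err
	}
	var orphans []string
	for _, e := range entries {
		if e.IsDir() && !active[e.Name()] {
			orphans = append(orphans, filepath.Join(kubeletRoot, "pods", e.Name(), "volumes"))
		}
	}
	return orphans, nil
}

func main() {
	dirs, err := orphanedVolumeDirs("/var/lib/kubelet", map[string]bool{})
	if err != nil {
		fmt.Println("scan failed:", err)
		return
	}
	for _, d := range dirs {
		fmt.Println("candidate for cleanup:", d)
	}
}
```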
Apr 16 21:07:46.328235 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.328216 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 21:07:46.328347 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.328297 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-g4rqf\""
Apr 16 21:07:46.333337 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.333314 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-tpg9j"]
Apr 16 21:07:46.356196 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.356171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c29d\" (UniqueName: \"kubernetes.io/projected/5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32-kube-api-access-2c29d\") pod \"postgres-868db5846d-tpg9j\" (UID: \"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32\") " pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.356295 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.356218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32-data\") pod \"postgres-868db5846d-tpg9j\" (UID: \"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32\") " pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.457054 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.457016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c29d\" (UniqueName: \"kubernetes.io/projected/5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32-kube-api-access-2c29d\") pod \"postgres-868db5846d-tpg9j\" (UID: \"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32\") " pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.457271 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.457081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32-data\") pod \"postgres-868db5846d-tpg9j\" (UID: \"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32\") " pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.457414 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.457397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32-data\") pod \"postgres-868db5846d-tpg9j\" (UID: \"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32\") " pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.465546 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.465517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c29d\" (UniqueName: \"kubernetes.io/projected/5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32-kube-api-access-2c29d\") pod \"postgres-868db5846d-tpg9j\" (UID: \"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32\") " pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.636916 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.636835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:46.758387 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:46.758363 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-tpg9j"]
Apr 16 21:07:46.760989 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:07:46.760957 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a89e38d_2beb_446b_a9f7_ddb8c3fb0c32.slice/crio-d27ea0383496e842fd444ee8890475e5b7705d8b091591791ff4ebd7892457ab WatchSource:0}: Error finding container d27ea0383496e842fd444ee8890475e5b7705d8b091591791ff4ebd7892457ab: Status 404 returned error can't find the container with id d27ea0383496e842fd444ee8890475e5b7705d8b091591791ff4ebd7892457ab
Apr 16 21:07:47.457891 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:47.457852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-tpg9j" event={"ID":"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32","Type":"ContainerStarted","Data":"d27ea0383496e842fd444ee8890475e5b7705d8b091591791ff4ebd7892457ab"}
Apr 16 21:07:52.475828 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:52.475787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-tpg9j" event={"ID":"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32","Type":"ContainerStarted","Data":"0820fd098107ae7c96532da265943df6d05a5642510bc037d6be6864ce9a3fe5"}
Apr 16 21:07:52.476193 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:52.475922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:52.494536 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:52.494489 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-tpg9j" podStartSLOduration=1.3042743909999999 podStartE2EDuration="6.494476811s" podCreationTimestamp="2026-04-16 21:07:46 +0000 UTC" firstStartedPulling="2026-04-16 21:07:46.762190236 +0000 UTC m=+594.551807219" lastFinishedPulling="2026-04-16 21:07:51.952392656 +0000 UTC m=+599.742009639" observedRunningTime="2026-04-16 21:07:52.492488124 +0000 UTC m=+600.282105129" watchObservedRunningTime="2026-04-16 21:07:52.494476811 +0000 UTC m=+600.284093816"
Apr 16 21:07:52.641788 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:52.641764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:07:52.641960 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:52.641794 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:07:58.506869 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:58.506842 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-tpg9j"
Apr 16 21:07:59.393211 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.393174 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7cb589bc-6zskn"]
Apr 16 21:07:59.398030 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.398007 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7cb589bc-6zskn"
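The volume entries identify each mount by a UniqueName such as kubernetes.io/empty-dir/<podUID>-<volume>, while the cleanup entries show the pod directory under /var/lib/kubelet/pods/<uid>/volumes. A small helper sketching the mapping between the two, assuming the kubelet's usual on-disk convention of escaping the plugin name with "~" (the function is ours, not a kubelet API):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// emptyDirHostPath maps a volume from the log's UniqueName notation to
// its assumed on-disk location under the kubelet root.
func emptyDirHostPath(kubeletRoot, podUID, volume string) string {
	plugin := strings.ReplaceAll("kubernetes.io/empty-dir", "/", "~")
	return filepath.Join(kubeletRoot, "pods", podUID, "volumes", plugin, volume)
}

func main() {
	// UID and volume name taken from the postgres entries above.
	fmt.Println(emptyDirHostPath("/var/lib/kubelet",
		"5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32", "data"))
	// /var/lib/kubelet/pods/5a89e38d-.../volumes/kubernetes.io~empty-dir/data
}
```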
Apr 16 21:07:59.400776 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.400750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 21:07:59.400971 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.400944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 21:07:59.401134 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.401119 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nxn52\""
Apr 16 21:07:59.406493 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.406475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7cb589bc-6zskn"]
Apr 16 21:07:59.412107 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.412087 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6db49c57b6-tsnh5"]
Apr 16 21:07:59.415072 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.415057 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:07:59.417971 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.417953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-t4z2d\""
Apr 16 21:07:59.426547 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.426511 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6db49c57b6-tsnh5"]
Apr 16 21:07:59.443026 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.442997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2jnc\" (UniqueName: \"kubernetes.io/projected/94623ad1-b32e-4749-b015-a444fa49fd32-kube-api-access-z2jnc\") pod \"maas-controller-6db49c57b6-tsnh5\" (UID: \"94623ad1-b32e-4749-b015-a444fa49fd32\") " pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:07:59.443123 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.443031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:07:59.443123 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.443071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sjg\" (UniqueName: \"kubernetes.io/projected/75fef077-3c22-4371-b4e3-5d2735c414e6-kube-api-access-42sjg\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:07:59.543713 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.543685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2jnc\" (UniqueName: \"kubernetes.io/projected/94623ad1-b32e-4749-b015-a444fa49fd32-kube-api-access-z2jnc\") pod \"maas-controller-6db49c57b6-tsnh5\" (UID: \"94623ad1-b32e-4749-b015-a444fa49fd32\") " pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:07:59.544090 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.543721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:07:59.544090 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.543753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42sjg\" (UniqueName: \"kubernetes.io/projected/75fef077-3c22-4371-b4e3-5d2735c414e6-kube-api-access-42sjg\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:07:59.544090 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:07:59.543846 2575 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 16 21:07:59.544090 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:07:59.543902 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls podName:75fef077-3c22-4371-b4e3-5d2735c414e6 nodeName:}" failed. No retries permitted until 2026-04-16 21:08:00.04388541 +0000 UTC m=+607.833502398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls") pod "maas-api-7cb589bc-6zskn" (UID: "75fef077-3c22-4371-b4e3-5d2735c414e6") : secret "maas-api-serving-cert" not found
Apr 16 21:07:59.557369 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.557343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2jnc\" (UniqueName: \"kubernetes.io/projected/94623ad1-b32e-4749-b015-a444fa49fd32-kube-api-access-z2jnc\") pod \"maas-controller-6db49c57b6-tsnh5\" (UID: \"94623ad1-b32e-4749-b015-a444fa49fd32\") " pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:07:59.557565 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.557547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sjg\" (UniqueName: \"kubernetes.io/projected/75fef077-3c22-4371-b4e3-5d2735c414e6-kube-api-access-42sjg\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:07:59.727479 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.727423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:07:59.844289 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:07:59.844217 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6db49c57b6-tsnh5"]
Apr 16 21:07:59.846692 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:07:59.846584 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94623ad1_b32e_4749_b015_a444fa49fd32.slice/crio-18033467c8f97c954b5777570294b634791acf072dbf87ed15884cec2cd092b5 WatchSource:0}: Error finding container 18033467c8f97c954b5777570294b634791acf072dbf87ed15884cec2cd092b5: Status 404 returned error can't find the container with id 18033467c8f97c954b5777570294b634791acf072dbf87ed15884cec2cd092b5
Apr 16 21:08:00.048133 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.048057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:08:00.050620 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.050590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls\") pod \"maas-api-7cb589bc-6zskn\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") " pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:08:00.307029 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.306958 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-57b457866-fpw7z"]
Apr 16 21:08:00.308562 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.308537 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:08:00.311413 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.311382 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.320091 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.320054 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-57b457866-fpw7z"]
Apr 16 21:08:00.350431 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.350397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c56b667-b3b8-4540-8d5f-2f00dc11eb59-maas-api-tls\") pod \"maas-api-57b457866-fpw7z\" (UID: \"8c56b667-b3b8-4540-8d5f-2f00dc11eb59\") " pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.350764 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.350727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mp6h\" (UniqueName: \"kubernetes.io/projected/8c56b667-b3b8-4540-8d5f-2f00dc11eb59-kube-api-access-7mp6h\") pod \"maas-api-57b457866-fpw7z\" (UID: \"8c56b667-b3b8-4540-8d5f-2f00dc11eb59\") " pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.453559 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.452548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c56b667-b3b8-4540-8d5f-2f00dc11eb59-maas-api-tls\") pod \"maas-api-57b457866-fpw7z\" (UID: \"8c56b667-b3b8-4540-8d5f-2f00dc11eb59\") " pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.453559 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.452609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mp6h\" (UniqueName: \"kubernetes.io/projected/8c56b667-b3b8-4540-8d5f-2f00dc11eb59-kube-api-access-7mp6h\") pod \"maas-api-57b457866-fpw7z\" (UID: \"8c56b667-b3b8-4540-8d5f-2f00dc11eb59\") " pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.456777 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.454186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7cb589bc-6zskn"]
Apr 16 21:08:00.456777 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.456558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c56b667-b3b8-4540-8d5f-2f00dc11eb59-maas-api-tls\") pod \"maas-api-57b457866-fpw7z\" (UID: \"8c56b667-b3b8-4540-8d5f-2f00dc11eb59\") " pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.466103 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.466078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mp6h\" (UniqueName: \"kubernetes.io/projected/8c56b667-b3b8-4540-8d5f-2f00dc11eb59-kube-api-access-7mp6h\") pod \"maas-api-57b457866-fpw7z\" (UID: \"8c56b667-b3b8-4540-8d5f-2f00dc11eb59\") " pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.502917 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.502883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6db49c57b6-tsnh5" event={"ID":"94623ad1-b32e-4749-b015-a444fa49fd32","Type":"ContainerStarted","Data":"18033467c8f97c954b5777570294b634791acf072dbf87ed15884cec2cd092b5"}
Apr 16 21:08:00.504065 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.504039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7cb589bc-6zskn" event={"ID":"75fef077-3c22-4371-b4e3-5d2735c414e6","Type":"ContainerStarted","Data":"16e2190a5f425131b329a94dea9f26e3422be3b98c0d7e71a941ddf03c087022"}
Apr 16 21:08:00.623902 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.623812 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:00.782502 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:00.782475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-57b457866-fpw7z"]
Apr 16 21:08:00.784348 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:08:00.784318 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c56b667_b3b8_4540_8d5f_2f00dc11eb59.slice/crio-5a414ac7f44f1847a10e0ec8e711ad18970133c7506cff15ca19f4899edb3d0c WatchSource:0}: Error finding container 5a414ac7f44f1847a10e0ec8e711ad18970133c7506cff15ca19f4899edb3d0c: Status 404 returned error can't find the container with id 5a414ac7f44f1847a10e0ec8e711ad18970133c7506cff15ca19f4899edb3d0c
Apr 16 21:08:01.510499 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:01.510431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-57b457866-fpw7z" event={"ID":"8c56b667-b3b8-4540-8d5f-2f00dc11eb59","Type":"ContainerStarted","Data":"5a414ac7f44f1847a10e0ec8e711ad18970133c7506cff15ca19f4899edb3d0c"}
Apr 16 21:08:03.519086 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.518993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-57b457866-fpw7z" event={"ID":"8c56b667-b3b8-4540-8d5f-2f00dc11eb59","Type":"ContainerStarted","Data":"c2afc1de0d5dc8014e381d21629960b649d74bbf720a0dd44b839913b2f0c269"}
Apr 16 21:08:03.519508 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.519083 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:03.520323 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.520298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7cb589bc-6zskn" event={"ID":"75fef077-3c22-4371-b4e3-5d2735c414e6","Type":"ContainerStarted","Data":"ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d"}
Apr 16 21:08:03.520432 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.520359 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:08:03.521556 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.521536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6db49c57b6-tsnh5" event={"ID":"94623ad1-b32e-4749-b015-a444fa49fd32","Type":"ContainerStarted","Data":"2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a"}
Apr 16 21:08:03.521681 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.521667 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:08:03.541160 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.541107 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-57b457866-fpw7z" podStartSLOduration=1.2080383590000001 podStartE2EDuration="3.541092594s" podCreationTimestamp="2026-04-16 21:08:00 +0000 UTC" firstStartedPulling="2026-04-16 21:08:00.785895067 +0000 UTC m=+608.575512050" lastFinishedPulling="2026-04-16 21:08:03.118949301 +0000 UTC m=+610.908566285" observedRunningTime="2026-04-16 21:08:03.539640333 +0000 UTC m=+611.329257348" watchObservedRunningTime="2026-04-16 21:08:03.541092594 +0000 UTC m=+611.330709598"
Apr 16 21:08:03.556527 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.556466 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7cb589bc-6zskn" podStartSLOduration=1.900694798 podStartE2EDuration="4.556430578s" podCreationTimestamp="2026-04-16 21:07:59 +0000 UTC" firstStartedPulling="2026-04-16 21:08:00.462642815 +0000 UTC m=+608.252259803" lastFinishedPulling="2026-04-16 21:08:03.118378597 +0000 UTC m=+610.907995583" observedRunningTime="2026-04-16 21:08:03.555646469 +0000 UTC m=+611.345263551" watchObservedRunningTime="2026-04-16 21:08:03.556430578 +0000 UTC m=+611.346047583"
Apr 16 21:08:03.573512 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:03.573459 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6db49c57b6-tsnh5" podStartSLOduration=1.303543642 podStartE2EDuration="4.573423326s" podCreationTimestamp="2026-04-16 21:07:59 +0000 UTC" firstStartedPulling="2026-04-16 21:07:59.848137705 +0000 UTC m=+607.637754703" lastFinishedPulling="2026-04-16 21:08:03.118017388 +0000 UTC m=+610.907634387" observedRunningTime="2026-04-16 21:08:03.571999888 +0000 UTC m=+611.361616893" watchObservedRunningTime="2026-04-16 21:08:03.573423326 +0000 UTC m=+611.363040332"
Apr 16 21:08:09.532143 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.532059 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:08:09.533717 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.533696 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-57b457866-fpw7z"
Apr 16 21:08:09.609273 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.609233 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7cb589bc-6zskn"]
Apr 16 21:08:09.609842 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.609702 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7cb589bc-6zskn" podUID="75fef077-3c22-4371-b4e3-5d2735c414e6" containerName="maas-api" containerID="cri-o://ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d" gracePeriod=30
Apr 16 21:08:09.846297 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.846270 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7cb589bc-6zskn"
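The "Killing container with a grace period ... gracePeriod=30" entry corresponds to a CRI StopContainer call in which the timeout carries the grace period: the runtime delivers SIGTERM and escalates to SIGKILL once the budget runs out. A minimal sketch against the CRI v1 API, assuming the CRI-O socket path typical on OpenShift nodes and grpc-go ≥ 1.63 for grpc.NewClient (illustrative only, not kubelet code; do not run against a production node):

```go
package main

import (
	"context"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumed CRI-O socket path; adjust for your runtime.
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 35*time.Second)
	defer cancel()

	// Timeout plays the role of the kubelet's gracePeriod: SIGTERM first,
	// SIGKILL after the 30s budget is exhausted.
	_, err = rt.StopContainer(ctx, &runtimeapi.StopContainerRequest{
		ContainerId: "ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d",
		Timeout:     30,
	})
	if err != nil {
		panic(err)
	}
}
```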
Apr 16 21:08:09.936007 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.935966 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls\") pod \"75fef077-3c22-4371-b4e3-5d2735c414e6\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") "
Apr 16 21:08:09.936180 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.936028 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42sjg\" (UniqueName: \"kubernetes.io/projected/75fef077-3c22-4371-b4e3-5d2735c414e6-kube-api-access-42sjg\") pod \"75fef077-3c22-4371-b4e3-5d2735c414e6\" (UID: \"75fef077-3c22-4371-b4e3-5d2735c414e6\") "
Apr 16 21:08:09.938275 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.938243 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75fef077-3c22-4371-b4e3-5d2735c414e6-kube-api-access-42sjg" (OuterVolumeSpecName: "kube-api-access-42sjg") pod "75fef077-3c22-4371-b4e3-5d2735c414e6" (UID: "75fef077-3c22-4371-b4e3-5d2735c414e6"). InnerVolumeSpecName "kube-api-access-42sjg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 21:08:09.938410 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:09.938279 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "75fef077-3c22-4371-b4e3-5d2735c414e6" (UID: "75fef077-3c22-4371-b4e3-5d2735c414e6"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 21:08:10.037198 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.037136 2575 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/75fef077-3c22-4371-b4e3-5d2735c414e6-maas-api-tls\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\""
Apr 16 21:08:10.037198 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.037190 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-42sjg\" (UniqueName: \"kubernetes.io/projected/75fef077-3c22-4371-b4e3-5d2735c414e6-kube-api-access-42sjg\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\""
Apr 16 21:08:10.547854 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.547814 2575 generic.go:358] "Generic (PLEG): container finished" podID="75fef077-3c22-4371-b4e3-5d2735c414e6" containerID="ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d" exitCode=0
Apr 16 21:08:10.548277 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.547871 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7cb589bc-6zskn"
Apr 16 21:08:10.548277 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.547893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7cb589bc-6zskn" event={"ID":"75fef077-3c22-4371-b4e3-5d2735c414e6","Type":"ContainerDied","Data":"ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d"}
Apr 16 21:08:10.548277 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.547933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7cb589bc-6zskn" event={"ID":"75fef077-3c22-4371-b4e3-5d2735c414e6","Type":"ContainerDied","Data":"16e2190a5f425131b329a94dea9f26e3422be3b98c0d7e71a941ddf03c087022"}
Apr 16 21:08:10.548277 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.547951 2575 scope.go:117] "RemoveContainer" containerID="ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d"
Apr 16 21:08:10.555769 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.555738 2575 scope.go:117] "RemoveContainer" containerID="ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d"
Apr 16 21:08:10.556069 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:08:10.556045 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d\": container with ID starting with ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d not found: ID does not exist" containerID="ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d"
Apr 16 21:08:10.556132 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.556078 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d"} err="failed to get container status \"ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d\": rpc error: code = NotFound desc = could not find container \"ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d\": container with ID starting with ec2e5926def404d08bad2e251c9928e5a5ef827f5219805aaac09e9023c8670d not found: ID does not exist"
Apr 16 21:08:10.569296 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.569263 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7cb589bc-6zskn"]
Apr 16 21:08:10.573633 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.573605 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7cb589bc-6zskn"]
Apr 16 21:08:10.719398 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:10.719358 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75fef077-3c22-4371-b4e3-5d2735c414e6" path="/var/lib/kubelet/pods/75fef077-3c22-4371-b4e3-5d2735c414e6/volumes"
Apr 16 21:08:14.529387 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.529348 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:08:14.847161 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.847077 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-74bd8dbff7-j766h"]
Apr 16 21:08:14.847389 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.847377 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75fef077-3c22-4371-b4e3-5d2735c414e6" containerName="maas-api"
Apr 16 21:08:14.847428 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.847390 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="75fef077-3c22-4371-b4e3-5d2735c414e6" containerName="maas-api"
Apr 16 21:08:14.847482 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.847468 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="75fef077-3c22-4371-b4e3-5d2735c414e6" containerName="maas-api"
Apr 16 21:08:14.851667 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.851641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:14.859633 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.859610 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-74bd8dbff7-j766h"]
Apr 16 21:08:14.977782 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:14.977743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlljg\" (UniqueName: \"kubernetes.io/projected/32fdc05f-aa5b-4c34-a721-cebdc344f1b8-kube-api-access-vlljg\") pod \"maas-controller-74bd8dbff7-j766h\" (UID: \"32fdc05f-aa5b-4c34-a721-cebdc344f1b8\") " pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:15.078953 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:15.078917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlljg\" (UniqueName: \"kubernetes.io/projected/32fdc05f-aa5b-4c34-a721-cebdc344f1b8-kube-api-access-vlljg\") pod \"maas-controller-74bd8dbff7-j766h\" (UID: \"32fdc05f-aa5b-4c34-a721-cebdc344f1b8\") " pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:15.087933 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:15.087903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlljg\" (UniqueName: \"kubernetes.io/projected/32fdc05f-aa5b-4c34-a721-cebdc344f1b8-kube-api-access-vlljg\") pod \"maas-controller-74bd8dbff7-j766h\" (UID: \"32fdc05f-aa5b-4c34-a721-cebdc344f1b8\") " pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:15.162616 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:15.162593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:15.486745 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:15.486633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-74bd8dbff7-j766h"]
Apr 16 21:08:15.488928 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:08:15.488895 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32fdc05f_aa5b_4c34_a721_cebdc344f1b8.slice/crio-ba06e8d65cb75d61ac90a131ff75eda1624e5e858f754745e18237c316ae14bd WatchSource:0}: Error finding container ba06e8d65cb75d61ac90a131ff75eda1624e5e858f754745e18237c316ae14bd: Status 404 returned error can't find the container with id ba06e8d65cb75d61ac90a131ff75eda1624e5e858f754745e18237c316ae14bd
Apr 16 21:08:15.566493 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:15.566454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74bd8dbff7-j766h" event={"ID":"32fdc05f-aa5b-4c34-a721-cebdc344f1b8","Type":"ContainerStarted","Data":"ba06e8d65cb75d61ac90a131ff75eda1624e5e858f754745e18237c316ae14bd"}
Apr 16 21:08:16.571145 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:16.571111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74bd8dbff7-j766h" event={"ID":"32fdc05f-aa5b-4c34-a721-cebdc344f1b8","Type":"ContainerStarted","Data":"021dfae3d217bcd5e520b90c4e69465bf6577f9ac9f3b4ec3f31dd5a6788460c"}
Apr 16 21:08:16.571544 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:16.571228 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:16.588805 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:16.588750 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-74bd8dbff7-j766h" podStartSLOduration=2.251522674 podStartE2EDuration="2.58873326s" podCreationTimestamp="2026-04-16 21:08:14 +0000 UTC" firstStartedPulling="2026-04-16 21:08:15.490366055 +0000 UTC m=+623.279983044" lastFinishedPulling="2026-04-16 21:08:15.827576647 +0000 UTC m=+623.617193630" observedRunningTime="2026-04-16 21:08:16.58735113 +0000 UTC m=+624.376968146" watchObservedRunningTime="2026-04-16 21:08:16.58873326 +0000 UTC m=+624.378350265"
Apr 16 21:08:27.576539 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:27.576506 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-74bd8dbff7-j766h"
Apr 16 21:08:27.622596 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:27.622565 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6db49c57b6-tsnh5"]
Apr 16 21:08:27.622835 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:27.622807 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6db49c57b6-tsnh5" podUID="94623ad1-b32e-4749-b015-a444fa49fd32" containerName="manager" containerID="cri-o://2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a" gracePeriod=10
Apr 16 21:08:27.866879 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:27.866856 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
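Note the ordering above: the replacement controller pod flips to readiness "ready" at 21:08:27.576, and only then does the API send DELETE for the old ReplicaSet's pod at 21:08:27.622; the maas-api pods at 21:08:09 show the same pattern. That is the deployment rolling update being gated on the readiness probes these SyncLoop entries record. A client-go sketch of watching for that Ready condition from outside, assuming a local kubeconfig (pod name and namespace copied from the log):

```go
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// podReady mirrors the readiness verdict by checking the PodReady
// condition that the kubelet's probes feed into.
func podReady(p *corev1.Pod) bool {
	for _, c := range p.Status.Conditions {
		if c.Type == corev1.PodReady {
			return c.Status == corev1.ConditionTrue
		}
	}
	return false
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Poll until the new pod reports Ready, as the rollout gate requires.
	err = wait.PollUntilContextTimeout(context.Background(), time.Second, 2*time.Minute, true,
		func(ctx context.Context) (bool, error) {
			p, err := cs.CoreV1().Pods("opendatahub").Get(ctx, "maas-controller-74bd8dbff7-j766h", metav1.GetOptions{})
			if err != nil {
				return false, nil // transient; keep polling
			}
			return podReady(p), nil
		})
	fmt.Println("wait finished:", err)
}
```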
Apr 16 21:08:27.986792 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:27.986760 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2jnc\" (UniqueName: \"kubernetes.io/projected/94623ad1-b32e-4749-b015-a444fa49fd32-kube-api-access-z2jnc\") pod \"94623ad1-b32e-4749-b015-a444fa49fd32\" (UID: \"94623ad1-b32e-4749-b015-a444fa49fd32\") "
Apr 16 21:08:27.988823 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:27.988785 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94623ad1-b32e-4749-b015-a444fa49fd32-kube-api-access-z2jnc" (OuterVolumeSpecName: "kube-api-access-z2jnc") pod "94623ad1-b32e-4749-b015-a444fa49fd32" (UID: "94623ad1-b32e-4749-b015-a444fa49fd32"). InnerVolumeSpecName "kube-api-access-z2jnc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 21:08:28.087824 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.087784 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2jnc\" (UniqueName: \"kubernetes.io/projected/94623ad1-b32e-4749-b015-a444fa49fd32-kube-api-access-z2jnc\") on node \"ip-10-0-141-171.ec2.internal\" DevicePath \"\""
Apr 16 21:08:28.611330 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.611287 2575 generic.go:358] "Generic (PLEG): container finished" podID="94623ad1-b32e-4749-b015-a444fa49fd32" containerID="2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a" exitCode=0
Apr 16 21:08:28.611786 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.611355 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6db49c57b6-tsnh5"
Apr 16 21:08:28.611786 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.611366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6db49c57b6-tsnh5" event={"ID":"94623ad1-b32e-4749-b015-a444fa49fd32","Type":"ContainerDied","Data":"2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a"}
Apr 16 21:08:28.611786 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.611405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6db49c57b6-tsnh5" event={"ID":"94623ad1-b32e-4749-b015-a444fa49fd32","Type":"ContainerDied","Data":"18033467c8f97c954b5777570294b634791acf072dbf87ed15884cec2cd092b5"}
Apr 16 21:08:28.611786 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.611417 2575 scope.go:117] "RemoveContainer" containerID="2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a"
Apr 16 21:08:28.619486 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.619462 2575 scope.go:117] "RemoveContainer" containerID="2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a"
Apr 16 21:08:28.619755 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:08:28.619733 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a\": container with ID starting with 2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a not found: ID does not exist" containerID="2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a"
Apr 16 21:08:28.619829 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.619763 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a"} err="failed to get container status \"2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a\": rpc error: code = NotFound desc = could not find container \"2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a\": container with ID starting with 2a1eb1e867dfc8c5853a42ce7ac85d8f0f2b194800e5edeb008e14e900d5ed7a not found: ID does not exist"
Apr 16 21:08:28.636299 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.636270 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6db49c57b6-tsnh5"]
Apr 16 21:08:28.638023 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.638001 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6db49c57b6-tsnh5"]
Apr 16 21:08:28.718831 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:28.718798 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94623ad1-b32e-4749-b015-a444fa49fd32" path="/var/lib/kubelet/pods/94623ad1-b32e-4749-b015-a444fa49fd32/volumes"
Apr 16 21:08:41.872711 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.872672 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"]
Apr 16 21:08:41.875004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.872989 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94623ad1-b32e-4749-b015-a444fa49fd32" containerName="manager"
Apr 16 21:08:41.875004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.873001 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="94623ad1-b32e-4749-b015-a444fa49fd32" containerName="manager"
Apr 16 21:08:41.875004 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.873056 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="94623ad1-b32e-4749-b015-a444fa49fd32" containerName="manager"
Apr 16 21:08:41.875884 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.875867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"
Apr 16 21:08:41.878666 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.878636 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 21:08:41.880018 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.879981 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-6pmmg\""
Apr 16 21:08:41.880018 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.880001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 21:08:41.880188 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.880014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 16 21:08:41.884702 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:41.884682 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"]
Apr 16 21:08:42.005486 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.005434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5510d8e-da97-48ee-bd55-c94946e47e95-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"
Apr 16 21:08:42.005647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.005498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"
Apr 16 21:08:42.005647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.005540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"
Apr 16 21:08:42.005647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.005599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"
Apr 16 21:08:42.005647 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.005623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"
Apr 16 21:08:42.005774 ip-10-0-141-171 kubenswrapper[2575]:
I0416 21:08:42.005682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgbv\" (UniqueName: \"kubernetes.io/projected/c5510d8e-da97-48ee-bd55-c94946e47e95-kube-api-access-ddgbv\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106107 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5510d8e-da97-48ee-bd55-c94946e47e95-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106107 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106310 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106242 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106310 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106379 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106379 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgbv\" (UniqueName: \"kubernetes.io/projected/c5510d8e-da97-48ee-bd55-c94946e47e95-kube-api-access-ddgbv\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106530 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: 
\"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106646 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.106727 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.106705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.108470 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.108434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c5510d8e-da97-48ee-bd55-c94946e47e95-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.108911 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.108891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c5510d8e-da97-48ee-bd55-c94946e47e95-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.114906 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.114879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgbv\" (UniqueName: \"kubernetes.io/projected/c5510d8e-da97-48ee-bd55-c94946e47e95-kube-api-access-ddgbv\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-46gxl\" (UID: \"c5510d8e-da97-48ee-bd55-c94946e47e95\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.185894 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.185866 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:08:42.317262 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.317235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl"] Apr 16 21:08:42.319716 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:08:42.319678 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5510d8e_da97_48ee_bd55_c94946e47e95.slice/crio-77c610df0447d5f86201c1e9812ce411a0f329887be9b3194b5ad6055c9cf045 WatchSource:0}: Error finding container 77c610df0447d5f86201c1e9812ce411a0f329887be9b3194b5ad6055c9cf045: Status 404 returned error can't find the container with id 77c610df0447d5f86201c1e9812ce411a0f329887be9b3194b5ad6055c9cf045 Apr 16 21:08:42.659527 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:42.659491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" event={"ID":"c5510d8e-da97-48ee-bd55-c94946e47e95","Type":"ContainerStarted","Data":"77c610df0447d5f86201c1e9812ce411a0f329887be9b3194b5ad6055c9cf045"} Apr 16 21:08:49.687753 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:49.687706 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" event={"ID":"c5510d8e-da97-48ee-bd55-c94946e47e95","Type":"ContainerStarted","Data":"a8c4ef213e57725e5eee17f07574e9a9f91176bb90dec0c82e13d6a824cde41c"} Apr 16 21:08:54.285235 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.285202 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m"] Apr 16 21:08:54.289902 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.289881 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.293157 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.292929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 21:08:54.299521 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.299495 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m"] Apr 16 21:08:54.316429 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.316367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.316429 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.316405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86e8c470-ee33-4f99-9c87-a656aaf3db20-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.316429 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.316462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs96x\" (UniqueName: \"kubernetes.io/projected/86e8c470-ee33-4f99-9c87-a656aaf3db20-kube-api-access-bs96x\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.317921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.316618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.317921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.316766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.317921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.316932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.417486 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.417402 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86e8c470-ee33-4f99-9c87-a656aaf3db20-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.417737 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.417490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs96x\" (UniqueName: \"kubernetes.io/projected/86e8c470-ee33-4f99-9c87-a656aaf3db20-kube-api-access-bs96x\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.417737 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.417535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.417737 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.417574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.417737 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.417644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.417737 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.417688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.418152 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.418000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.418152 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.418020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: 
\"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.418152 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.418089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.420349 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.420322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86e8c470-ee33-4f99-9c87-a656aaf3db20-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.420500 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.420414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86e8c470-ee33-4f99-9c87-a656aaf3db20-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.435892 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.435862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs96x\" (UniqueName: \"kubernetes.io/projected/86e8c470-ee33-4f99-9c87-a656aaf3db20-kube-api-access-bs96x\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m\" (UID: \"86e8c470-ee33-4f99-9c87-a656aaf3db20\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.601900 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.601807 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:08:54.760592 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:54.760555 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m"] Apr 16 21:08:54.767258 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:08:54.767219 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e8c470_ee33_4f99_9c87_a656aaf3db20.slice/crio-1d7843380c5cfee72d4f2f2f76b4fef1686e8f2812550c94e23db98e0c2cf6a9 WatchSource:0}: Error finding container 1d7843380c5cfee72d4f2f2f76b4fef1686e8f2812550c94e23db98e0c2cf6a9: Status 404 returned error can't find the container with id 1d7843380c5cfee72d4f2f2f76b4fef1686e8f2812550c94e23db98e0c2cf6a9 Apr 16 21:08:55.709605 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:55.709562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" event={"ID":"86e8c470-ee33-4f99-9c87-a656aaf3db20","Type":"ContainerStarted","Data":"c35937e1267dddcb74634e7b3a9f0fa1a182b68b82838c0a1e35fc4e0faad923"} Apr 16 21:08:55.709605 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:55.709605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" event={"ID":"86e8c470-ee33-4f99-9c87-a656aaf3db20","Type":"ContainerStarted","Data":"1d7843380c5cfee72d4f2f2f76b4fef1686e8f2812550c94e23db98e0c2cf6a9"} Apr 16 21:08:58.721592 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:58.721505 2575 generic.go:358] "Generic (PLEG): container finished" podID="c5510d8e-da97-48ee-bd55-c94946e47e95" containerID="a8c4ef213e57725e5eee17f07574e9a9f91176bb90dec0c82e13d6a824cde41c" exitCode=0 Apr 16 21:08:58.721592 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:08:58.721577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" event={"ID":"c5510d8e-da97-48ee-bd55-c94946e47e95","Type":"ContainerDied","Data":"a8c4ef213e57725e5eee17f07574e9a9f91176bb90dec0c82e13d6a824cde41c"} Apr 16 21:09:02.472818 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.472777 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6"] Apr 16 21:09:02.489921 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.489886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6"] Apr 16 21:09:02.490108 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.490040 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.494322 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.493792 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 21:09:02.593989 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.593941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4f69d66-fb99-407e-a07f-85ffc9f80931-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.594183 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.594029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.594183 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.594085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.594284 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.594173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.594284 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.594205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.594284 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.594241 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2vtw\" (UniqueName: \"kubernetes.io/projected/c4f69d66-fb99-407e-a07f-85ffc9f80931-kube-api-access-v2vtw\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695008 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.694961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 
21:09:02.695193 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695193 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695193 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695193 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2vtw\" (UniqueName: \"kubernetes.io/projected/c4f69d66-fb99-407e-a07f-85ffc9f80931-kube-api-access-v2vtw\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695405 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4f69d66-fb99-407e-a07f-85ffc9f80931-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695405 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695548 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.695606 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.695545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.698150 
ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.698119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4f69d66-fb99-407e-a07f-85ffc9f80931-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.698817 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.698338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4f69d66-fb99-407e-a07f-85ffc9f80931-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.708024 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.707964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2vtw\" (UniqueName: \"kubernetes.io/projected/c4f69d66-fb99-407e-a07f-85ffc9f80931-kube-api-access-v2vtw\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6\" (UID: \"c4f69d66-fb99-407e-a07f-85ffc9f80931\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:02.748278 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.748136 2575 generic.go:358] "Generic (PLEG): container finished" podID="86e8c470-ee33-4f99-9c87-a656aaf3db20" containerID="c35937e1267dddcb74634e7b3a9f0fa1a182b68b82838c0a1e35fc4e0faad923" exitCode=0 Apr 16 21:09:02.748278 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.748175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" event={"ID":"86e8c470-ee33-4f99-9c87-a656aaf3db20","Type":"ContainerDied","Data":"c35937e1267dddcb74634e7b3a9f0fa1a182b68b82838c0a1e35fc4e0faad923"} Apr 16 21:09:02.803366 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:02.803256 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:03.942188 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:03.942163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6"] Apr 16 21:09:03.944730 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:09:03.944682 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f69d66_fb99_407e_a07f_85ffc9f80931.slice/crio-179dc6ba2c448366a7a188aeacb5bb6f9e244e163344dad6fc86d0de49f62985 WatchSource:0}: Error finding container 179dc6ba2c448366a7a188aeacb5bb6f9e244e163344dad6fc86d0de49f62985: Status 404 returned error can't find the container with id 179dc6ba2c448366a7a188aeacb5bb6f9e244e163344dad6fc86d0de49f62985 Apr 16 21:09:03.946863 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:03.946847 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:09:04.758413 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.758368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" event={"ID":"c5510d8e-da97-48ee-bd55-c94946e47e95","Type":"ContainerStarted","Data":"7513c6537df291c0ea501011e0b777a1a299408ce2f564b51fa8b3959bcf096b"} Apr 16 21:09:04.758657 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.758601 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:09:04.760088 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.760058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" event={"ID":"86e8c470-ee33-4f99-9c87-a656aaf3db20","Type":"ContainerStarted","Data":"30ad08ab11b0832d7caa1d0ae37ebc163c82b02ad171f9a9ddad009f84b72852"} Apr 16 21:09:04.760276 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.760261 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:09:04.761493 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.761465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" event={"ID":"c4f69d66-fb99-407e-a07f-85ffc9f80931","Type":"ContainerStarted","Data":"4159af2b125b1fe586c96701dfa73e4f0014beaff0464c9fd6346a3da22d3935"} Apr 16 21:09:04.761592 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.761503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" event={"ID":"c4f69d66-fb99-407e-a07f-85ffc9f80931","Type":"ContainerStarted","Data":"179dc6ba2c448366a7a188aeacb5bb6f9e244e163344dad6fc86d0de49f62985"} Apr 16 21:09:04.778733 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.778685 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" podStartSLOduration=2.2270025 podStartE2EDuration="23.778670604s" podCreationTimestamp="2026-04-16 21:08:41 +0000 UTC" firstStartedPulling="2026-04-16 21:08:42.321918966 +0000 UTC m=+650.111535954" lastFinishedPulling="2026-04-16 21:09:03.873587072 +0000 UTC m=+671.663204058" observedRunningTime="2026-04-16 21:09:04.777382393 +0000 UTC m=+672.566999396" watchObservedRunningTime="2026-04-16 21:09:04.778670604 +0000 
UTC m=+672.568287610" Apr 16 21:09:04.815854 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:04.815795 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" podStartSLOduration=9.698135367999999 podStartE2EDuration="10.815779228s" podCreationTimestamp="2026-04-16 21:08:54 +0000 UTC" firstStartedPulling="2026-04-16 21:09:02.749021163 +0000 UTC m=+670.538638146" lastFinishedPulling="2026-04-16 21:09:03.866665009 +0000 UTC m=+671.656282006" observedRunningTime="2026-04-16 21:09:04.815163883 +0000 UTC m=+672.604780891" watchObservedRunningTime="2026-04-16 21:09:04.815779228 +0000 UTC m=+672.605396233" Apr 16 21:09:09.778585 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:09.778549 2575 generic.go:358] "Generic (PLEG): container finished" podID="c4f69d66-fb99-407e-a07f-85ffc9f80931" containerID="4159af2b125b1fe586c96701dfa73e4f0014beaff0464c9fd6346a3da22d3935" exitCode=0 Apr 16 21:09:09.778984 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:09.778602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" event={"ID":"c4f69d66-fb99-407e-a07f-85ffc9f80931","Type":"ContainerDied","Data":"4159af2b125b1fe586c96701dfa73e4f0014beaff0464c9fd6346a3da22d3935"} Apr 16 21:09:10.784083 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:10.784045 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" event={"ID":"c4f69d66-fb99-407e-a07f-85ffc9f80931","Type":"ContainerStarted","Data":"87533b54f4b3cc419a7dcb5da32acc44ad20fede4c747350e7b0584a3f361283"} Apr 16 21:09:10.784528 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:10.784280 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:10.804650 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:10.804602 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" podStartSLOduration=8.383402825 podStartE2EDuration="8.804586542s" podCreationTimestamp="2026-04-16 21:09:02 +0000 UTC" firstStartedPulling="2026-04-16 21:09:09.779179229 +0000 UTC m=+677.568796213" lastFinishedPulling="2026-04-16 21:09:10.200362947 +0000 UTC m=+677.989979930" observedRunningTime="2026-04-16 21:09:10.802627171 +0000 UTC m=+678.592244179" watchObservedRunningTime="2026-04-16 21:09:10.804586542 +0000 UTC m=+678.594203597" Apr 16 21:09:15.777674 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:15.777641 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-46gxl" Apr 16 21:09:15.778398 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:15.778371 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m" Apr 16 21:09:21.803204 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:21.803175 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6" Apr 16 21:09:34.978262 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:34.978217 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h"] Apr 16 21:09:35.036843 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.036813 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h"] Apr 16 21:09:35.037009 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.036946 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.039733 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.039708 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 21:09:35.180014 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.179977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgdgz\" (UniqueName: \"kubernetes.io/projected/c9bacd69-a33a-4078-8d74-02218519da22-kube-api-access-jgdgz\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.180175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.180021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.180175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.180057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bacd69-a33a-4078-8d74-02218519da22-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.180175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.180087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.180175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.180102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.180175 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.180127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281220 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgdgz\" (UniqueName: \"kubernetes.io/projected/c9bacd69-a33a-4078-8d74-02218519da22-kube-api-access-jgdgz\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281220 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281477 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bacd69-a33a-4078-8d74-02218519da22-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281477 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281477 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281612 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281612 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281797 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.281855 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.281806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: 
\"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.283482 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.283458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c9bacd69-a33a-4078-8d74-02218519da22-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.283619 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.283602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bacd69-a33a-4078-8d74-02218519da22-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.289733 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.289706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgdgz\" (UniqueName: \"kubernetes.io/projected/c9bacd69-a33a-4078-8d74-02218519da22-kube-api-access-jgdgz\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h\" (UID: \"c9bacd69-a33a-4078-8d74-02218519da22\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.349384 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.349351 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" Apr 16 21:09:35.481079 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.481018 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h"] Apr 16 21:09:35.483503 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:09:35.483477 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9bacd69_a33a_4078_8d74_02218519da22.slice/crio-dea4e705d962d1f2b43c100d5829834569aff430429a03ec8899f8274e227132 WatchSource:0}: Error finding container dea4e705d962d1f2b43c100d5829834569aff430429a03ec8899f8274e227132: Status 404 returned error can't find the container with id dea4e705d962d1f2b43c100d5829834569aff430429a03ec8899f8274e227132 Apr 16 21:09:35.881244 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.881149 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" event={"ID":"c9bacd69-a33a-4078-8d74-02218519da22","Type":"ContainerStarted","Data":"1fd0bdfc6ff75fd90cf45a2109a353ff48f86d0db14c99dde3d53bf64b11933e"} Apr 16 21:09:35.881244 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:35.881194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" event={"ID":"c9bacd69-a33a-4078-8d74-02218519da22","Type":"ContainerStarted","Data":"dea4e705d962d1f2b43c100d5829834569aff430429a03ec8899f8274e227132"} Apr 16 21:09:37.268638 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.268607 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"] Apr 16 21:09:37.270821 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.270802 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.274284 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.274261 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 16 21:09:37.286276 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.286253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"]
Apr 16 21:09:37.400406 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.400369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d82cf-358d-4feb-8284-180c55c7c8f9-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.400406 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.400416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.400667 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.400547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.400667 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.400587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjcwk\" (UniqueName: \"kubernetes.io/projected/a89d82cf-358d-4feb-8284-180c55c7c8f9-kube-api-access-qjcwk\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.400667 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.400634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.400667 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.400664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501417 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjcwk\" (UniqueName: \"kubernetes.io/projected/a89d82cf-358d-4feb-8284-180c55c7c8f9-kube-api-access-qjcwk\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501625 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501625 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501625 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d82cf-358d-4feb-8284-180c55c7c8f9-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501625 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501838 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.501938 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.501910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.502097 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.502073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.502206 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.502039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.503917 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.503897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a89d82cf-358d-4feb-8284-180c55c7c8f9-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.504165 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.504140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d82cf-358d-4feb-8284-180c55c7c8f9-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.512471 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.512425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjcwk\" (UniqueName: \"kubernetes.io/projected/a89d82cf-358d-4feb-8284-180c55c7c8f9-kube-api-access-qjcwk\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-hprdt\" (UID: \"a89d82cf-358d-4feb-8284-180c55c7c8f9\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.581994 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.581903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:37.717564 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.717532 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"]
Apr 16 21:09:37.720343 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:09:37.720319 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89d82cf_358d_4feb_8284_180c55c7c8f9.slice/crio-b4dd6caddbafd0ffbc60e9a88382998c05eccfcb648508cf26b6ccde8b8f5f5e WatchSource:0}: Error finding container b4dd6caddbafd0ffbc60e9a88382998c05eccfcb648508cf26b6ccde8b8f5f5e: Status 404 returned error can't find the container with id b4dd6caddbafd0ffbc60e9a88382998c05eccfcb648508cf26b6ccde8b8f5f5e
Apr 16 21:09:37.889832 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.889738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt" event={"ID":"a89d82cf-358d-4feb-8284-180c55c7c8f9","Type":"ContainerStarted","Data":"172f8d44a624898016824c0aaa63996916f678f96963af7f3a9d3ca6c14d9e88"}
Apr 16 21:09:37.889832 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:37.889783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt" event={"ID":"a89d82cf-358d-4feb-8284-180c55c7c8f9","Type":"ContainerStarted","Data":"b4dd6caddbafd0ffbc60e9a88382998c05eccfcb648508cf26b6ccde8b8f5f5e"}
Apr 16 21:09:41.904398 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:41.904360 2575 generic.go:358] "Generic (PLEG): container finished" podID="c9bacd69-a33a-4078-8d74-02218519da22" containerID="1fd0bdfc6ff75fd90cf45a2109a353ff48f86d0db14c99dde3d53bf64b11933e" exitCode=0
Apr 16 21:09:41.904831 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:41.904467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" event={"ID":"c9bacd69-a33a-4078-8d74-02218519da22","Type":"ContainerDied","Data":"1fd0bdfc6ff75fd90cf45a2109a353ff48f86d0db14c99dde3d53bf64b11933e"}
Apr 16 21:09:42.909834 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:42.909798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" event={"ID":"c9bacd69-a33a-4078-8d74-02218519da22","Type":"ContainerStarted","Data":"3ce86a33d3a72d8077c2788c3332038265c4c3e968e8b2fc2a609dab99c1c872"}
Apr 16 21:09:42.910195 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:42.910019 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h"
Apr 16 21:09:42.929414 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:42.929355 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h" podStartSLOduration=8.735181065999999 podStartE2EDuration="8.929336276s" podCreationTimestamp="2026-04-16 21:09:34 +0000 UTC" firstStartedPulling="2026-04-16 21:09:41.905336474 +0000 UTC m=+709.694953474" lastFinishedPulling="2026-04-16 21:09:42.099491697 +0000 UTC m=+709.889108684" observedRunningTime="2026-04-16 21:09:42.928349638 +0000 UTC m=+710.717966646" watchObservedRunningTime="2026-04-16 21:09:42.929336276 +0000 UTC m=+710.718953282"
Apr 16 21:09:43.915251 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:43.915210 2575 generic.go:358] "Generic (PLEG): container finished" podID="a89d82cf-358d-4feb-8284-180c55c7c8f9" containerID="172f8d44a624898016824c0aaa63996916f678f96963af7f3a9d3ca6c14d9e88" exitCode=0
Apr 16 21:09:43.915748 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:43.915288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt" event={"ID":"a89d82cf-358d-4feb-8284-180c55c7c8f9","Type":"ContainerDied","Data":"172f8d44a624898016824c0aaa63996916f678f96963af7f3a9d3ca6c14d9e88"}
Apr 16 21:09:44.121548 ip-10-0-141-171 kubenswrapper[2575]: E0416 21:09:44.121514 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89d82cf_358d_4feb_8284_180c55c7c8f9.slice/crio-b7858b059d290cd5a500a6a31e15e5027c24c4d4dc84b39f00e23929d0781b1e.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 21:09:44.921526 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:44.921486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt" event={"ID":"a89d82cf-358d-4feb-8284-180c55c7c8f9","Type":"ContainerStarted","Data":"b7858b059d290cd5a500a6a31e15e5027c24c4d4dc84b39f00e23929d0781b1e"}
Apr 16 21:09:44.921948 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:44.921898 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:09:44.946019 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:44.945975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt" podStartSLOduration=7.782730191 podStartE2EDuration="7.945961774s" podCreationTimestamp="2026-04-16 21:09:37 +0000 UTC" firstStartedPulling="2026-04-16 21:09:43.916124426 +0000 UTC m=+711.705741412" lastFinishedPulling="2026-04-16 21:09:44.079356011 +0000 UTC m=+711.868972995" observedRunningTime="2026-04-16 21:09:44.944024526 +0000 UTC m=+712.733641580" watchObservedRunningTime="2026-04-16 21:09:44.945961774 +0000 UTC m=+712.735578778"
Apr 16 21:09:53.931932 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:53.931886 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h"
Apr 16 21:09:55.940434 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:09:55.940400 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-hprdt"
Apr 16 21:12:52.664208 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:12:52.664175 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:12:52.665367 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:12:52.665347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:17:52.685975 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:17:52.685947 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:17:52.687895 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:17:52.687872 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:22:52.711241 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:22:52.711215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:22:52.714742 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:22:52.714719 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:27:52.735989 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:27:52.735959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:27:52.740585 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:27:52.740564 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tzbn8_ffb608ad-9008-418c-a258-d80cf876c140/ovn-acl-logging/0.log"
Apr 16 21:28:10.459581 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:10.459541 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-5khsz_e6ef7701-a721-4236-977d-d94cb403e9b2/manager/0.log"
Apr 16 21:28:10.583912 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:10.583883 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-57b457866-fpw7z_8c56b667-b3b8-4540-8d5f-2f00dc11eb59/maas-api/0.log"
Apr 16 21:28:10.710420 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:10.710341 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-74bd8dbff7-j766h_32fdc05f-aa5b-4c34-a721-cebdc344f1b8/manager/0.log"
Apr 16 21:28:10.836533 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:10.836498 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-9mpb6_900eb218-7e34-4aad-9e70-a8cb276b5b9f/manager/2.log"
Apr 16 21:28:10.960159 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:10.960125 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f94c666bb-9phcp_0d2b81e9-f4cc-4f8a-ac8f-86e433868873/manager/0.log"
Apr 16 21:28:11.316374 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:11.316341 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-tpg9j_5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32/postgres/0.log"
Apr 16 21:28:13.215299 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:13.215269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-z8frn_e15f0b1a-9607-4abd-a1bf-04eb18545b11/manager/0.log"
Apr 16 21:28:13.914200 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:13.914165 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vv8xs_34b44077-1c57-46bf-a8ce-9eb591c5f352/discovery/0.log"
Apr 16 21:28:14.138020 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.137983 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56b49765cd-cszkj_4ed78395-b9d7-4a62-b308-e860cc2c8ce5/kube-auth-proxy/0.log"
Apr 16 21:28:14.710621 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.710590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h_c9bacd69-a33a-4078-8d74-02218519da22/main/0.log"
Apr 16 21:28:14.718481 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.718455 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-sfq2h_c9bacd69-a33a-4078-8d74-02218519da22/storage-initializer/0.log"
Apr 16 21:28:14.829922 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.829897 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-hprdt_a89d82cf-358d-4feb-8284-180c55c7c8f9/main/0.log"
Apr 16 21:28:14.837727 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.837698 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-hprdt_a89d82cf-358d-4feb-8284-180c55c7c8f9/storage-initializer/0.log"
Apr 16 21:28:14.949643 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.949615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m_86e8c470-ee33-4f99-9c87-a656aaf3db20/storage-initializer/0.log"
Apr 16 21:28:14.957759 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:14.957732 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccxsx7m_86e8c470-ee33-4f99-9c87-a656aaf3db20/main/0.log"
Apr 16 21:28:15.073153 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:15.073088 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6_c4f69d66-fb99-407e-a07f-85ffc9f80931/main/0.log"
Apr 16 21:28:15.080138 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:15.080114 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-c76w6_c4f69d66-fb99-407e-a07f-85ffc9f80931/storage-initializer/0.log"
Apr 16 21:28:15.192305 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:15.192275 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-46gxl_c5510d8e-da97-48ee-bd55-c94946e47e95/storage-initializer/0.log"
Apr 16 21:28:15.200320 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:15.200289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-46gxl_c5510d8e-da97-48ee-bd55-c94946e47e95/main/0.log"
Apr 16 21:28:22.027408 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:22.027372 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qhtpj_90b5377e-36eb-4780-83e8-96ffc2917ab6/global-pull-secret-syncer/0.log"
Apr 16 21:28:22.202229 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:22.202197 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q97vd_655d4da6-65a0-4fa0-98ef-e68070f32130/konnectivity-agent/0.log"
Apr 16 21:28:22.280856 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:22.280781 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-171.ec2.internal_334c8afc5dc3a1fc4c297cab1fdb85d1/haproxy/0.log"
Apr 16 21:28:27.024078 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:27.024048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-z8frn_e15f0b1a-9607-4abd-a1bf-04eb18545b11/manager/0.log"
Apr 16 21:28:29.249316 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:29.249286 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5bnx_e84b920a-4041-4dcf-a3ee-2e9e4187bfe7/node-exporter/0.log"
Apr 16 21:28:29.274305 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:29.274281 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5bnx_e84b920a-4041-4dcf-a3ee-2e9e4187bfe7/kube-rbac-proxy/0.log"
Apr 16 21:28:29.300287 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:29.300257 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5bnx_e84b920a-4041-4dcf-a3ee-2e9e4187bfe7/init-textfile/0.log"
Apr 16 21:28:30.700270 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.700235 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"]
Apr 16 21:28:30.703581 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.703558 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.706718 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.706696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hmfkt\"/\"default-dockercfg-7w55v\""
Apr 16 21:28:30.707986 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.707959 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hmfkt\"/\"openshift-service-ca.crt\""
Apr 16 21:28:30.708077 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.708032 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hmfkt\"/\"kube-root-ca.crt\""
Apr 16 21:28:30.719310 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.719290 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"]
Apr 16 21:28:30.735685 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.735661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7p5\" (UniqueName: \"kubernetes.io/projected/c83b85b9-9cf6-433d-aaf0-12d8a370db26-kube-api-access-xt7p5\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.735793 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.735703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-proc\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.735793 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.735758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-sys\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.735869 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.735817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-podres\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.735903 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.735859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-lib-modules\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.836925 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.836885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-sys\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.836943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-podres\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.836981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-lib-modules\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.837011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7p5\" (UniqueName: \"kubernetes.io/projected/c83b85b9-9cf6-433d-aaf0-12d8a370db26-kube-api-access-xt7p5\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.837020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-sys\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837100 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.837034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-proc\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837270 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.837140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-lib-modules\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837270 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.837140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-podres\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.837270 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.837150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c83b85b9-9cf6-433d-aaf0-12d8a370db26-proc\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:30.847446 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:30.847410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7p5\" (UniqueName: \"kubernetes.io/projected/c83b85b9-9cf6-433d-aaf0-12d8a370db26-kube-api-access-xt7p5\") pod \"perf-node-gather-daemonset-cx5qb\" (UID: \"c83b85b9-9cf6-433d-aaf0-12d8a370db26\") " pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:31.013340 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.013237 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:31.143975 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.143945 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"]
Apr 16 21:28:31.145792 ip-10-0-141-171 kubenswrapper[2575]: W0416 21:28:31.145763 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc83b85b9_9cf6_433d_aaf0_12d8a370db26.slice/crio-8293a8b1217a59c71a691ab6d7ba019e82a50f5654b329df01f4410a84f8c89a WatchSource:0}: Error finding container 8293a8b1217a59c71a691ab6d7ba019e82a50f5654b329df01f4410a84f8c89a: Status 404 returned error can't find the container with id 8293a8b1217a59c71a691ab6d7ba019e82a50f5654b329df01f4410a84f8c89a
Apr 16 21:28:31.147291 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.147274 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:28:31.273471 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.273378 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ck6cd_7852b3f2-db76-4624-a66c-450474aeaa93/networking-console-plugin/0.log"
Apr 16 21:28:31.788572 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.788533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb" event={"ID":"c83b85b9-9cf6-433d-aaf0-12d8a370db26","Type":"ContainerStarted","Data":"83723f724b3666e3efc85c00644c567ea67cdfc3c3f47d1558dad81d1c10ed89"}
Apr 16 21:28:31.788572 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.788573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb" event={"ID":"c83b85b9-9cf6-433d-aaf0-12d8a370db26","Type":"ContainerStarted","Data":"8293a8b1217a59c71a691ab6d7ba019e82a50f5654b329df01f4410a84f8c89a"}
Apr 16 21:28:31.789046 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:31.788604 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:33.487763 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:33.487725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b7bcv_6767bcb0-f122-44f9-b378-3f18e741a065/dns/0.log"
Apr 16 21:28:33.511942 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:33.511917 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b7bcv_6767bcb0-f122-44f9-b378-3f18e741a065/kube-rbac-proxy/0.log"
Apr 16 21:28:33.707845 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:33.707795 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tzlx6_92c3499b-a30d-4470-8b98-4e3a3b91de06/dns-node-resolver/0.log"
Apr 16 21:28:34.259761 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:34.259709 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7hjd_63a48313-4beb-4c3e-89bb-bddcb5b76d5a/node-ca/0.log"
Apr 16 21:28:35.295576 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:35.295523 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vv8xs_34b44077-1c57-46bf-a8ce-9eb591c5f352/discovery/0.log"
Apr 16 21:28:35.344365 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:35.344332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56b49765cd-cszkj_4ed78395-b9d7-4a62-b308-e860cc2c8ce5/kube-auth-proxy/0.log"
Apr 16 21:28:35.950052 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:35.950019 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-72rld_97e00ea1-e79f-4a6f-b820-0cafb65f4308/serve-healthcheck-canary/0.log"
Apr 16 21:28:36.515170 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:36.515141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pzkf8_d4e21d83-3fa0-4b03-80a5-9daa14fbf570/kube-rbac-proxy/0.log"
Apr 16 21:28:36.537058 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:36.537031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pzkf8_d4e21d83-3fa0-4b03-80a5-9daa14fbf570/exporter/0.log"
Apr 16 21:28:36.561454 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:36.561413 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pzkf8_d4e21d83-3fa0-4b03-80a5-9daa14fbf570/extractor/0.log"
Apr 16 21:28:37.802120 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:37.802090 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb"
Apr 16 21:28:37.820142 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:37.820090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hmfkt/perf-node-gather-daemonset-cx5qb" podStartSLOduration=7.820073734 podStartE2EDuration="7.820073734s" podCreationTimestamp="2026-04-16 21:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:28:31.815124754 +0000 UTC m=+1839.604741759" watchObservedRunningTime="2026-04-16 21:28:37.820073734 +0000 UTC m=+1845.609690739"
Apr 16 21:28:38.565669 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.565627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-5khsz_e6ef7701-a721-4236-977d-d94cb403e9b2/manager/0.log"
Apr 16 21:28:38.605184 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.605145 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-57b457866-fpw7z_8c56b667-b3b8-4540-8d5f-2f00dc11eb59/maas-api/0.log"
Apr 16 21:28:38.660632 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.660597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-74bd8dbff7-j766h_32fdc05f-aa5b-4c34-a721-cebdc344f1b8/manager/0.log"
Apr 16 21:28:38.711296 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.711270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-9mpb6_900eb218-7e34-4aad-9e70-a8cb276b5b9f/manager/1.log"
Apr 16 21:28:38.732133 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.732103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-9mpb6_900eb218-7e34-4aad-9e70-a8cb276b5b9f/manager/2.log"
Apr 16 21:28:38.786094 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.786062 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f94c666bb-9phcp_0d2b81e9-f4cc-4f8a-ac8f-86e433868873/manager/0.log"
Apr 16 21:28:38.861577 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:38.861496 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-tpg9j_5a89e38d-2beb-446b-a9f7-ddb8c3fb0c32/postgres/0.log"
Apr 16 21:28:40.243722 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:40.243687 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-wqbvh_0cf30573-34f1-415d-b0be-ebf5aa3ddca8/openshift-lws-operator/0.log"
Apr 16 21:28:45.897297 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:45.897270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:28:45.921132 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:45.921106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/egress-router-binary-copy/0.log"
Apr 16 21:28:45.943274 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:45.943244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/cni-plugins/0.log"
Apr 16 21:28:45.964385 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:45.964356 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/bond-cni-plugin/0.log"
Apr 16 21:28:45.985221 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:45.985196 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/routeoverride-cni/0.log"
Apr 16 21:28:46.007566 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:46.007546 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/whereabouts-cni-bincopy/0.log"
Apr 16 21:28:46.029154 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:46.029133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jcqc_c6128194-41ca-4dbf-a538-c346ec94bd50/whereabouts-cni/0.log"
Apr 16 21:28:46.398571 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:46.398531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bbf6s_8d0ad4e1-5b83-4118-baa5-e8531b28ca54/kube-multus/0.log"
Apr 16 21:28:46.555188 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:46.555156 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zbj49_85e4090a-8dbc-4412-b20a-23d79d838363/network-metrics-daemon/0.log"
Apr 16 21:28:46.574587 ip-10-0-141-171 kubenswrapper[2575]: I0416 21:28:46.574560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zbj49_85e4090a-8dbc-4412-b20a-23d79d838363/kube-rbac-proxy/0.log"