Apr 21 04:36:19.051459 ip-10-0-140-11 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 04:36:19.051591 ip-10-0-140-11 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 04:36:19.051676 ip-10-0-140-11 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 04:36:19.052049 ip-10-0-140-11 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 04:36:29.108239 ip-10-0-140-11 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 04:36:29.108256 ip-10-0-140-11 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6c847e6890244fdc985dd7b20bb1e4be --
Apr 21 04:38:33.473768 ip-10-0-140-11 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 04:38:33.976553 ip-10-0-140-11 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:38:33.976553 ip-10-0-140-11 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 04:38:33.976553 ip-10-0-140-11 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:38:33.976553 ip-10-0-140-11 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 04:38:33.976553 ip-10-0-140-11 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:38:33.978738 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.978652    2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 04:38:33.985346 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985323    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:33.985346 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985341    2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:33.985346 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985346    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:33.985346 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985349    2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:33.985346 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985352    2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985355    2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985359    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985363    2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985366    2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985369    2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985371    2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985374    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985377    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985380    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985383    2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985386    2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985388    2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985391    2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985393    2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985396    2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985398    2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985402    2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985409    2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:33.985563 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985412    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985415    2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985417    2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985420    2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985422    2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985425    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985427    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985430    2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985432    2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985436    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985439    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985442    2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985444    2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985447    2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985450    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985455    2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985458    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985461    2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985464    2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985467    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:33.986007 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985469    2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985472    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985475    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985477    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985480    2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985483    2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985485    2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985488    2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985491    2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985508    2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985511    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985514    2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985517    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985520    2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985522    2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985525    2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985528    2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985531    2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985533    2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:33.986569 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985536    2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985539    2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985542    2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985544    2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985547    2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985551    2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985553    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985557    2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985561    2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985565    2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985569    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985572    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985575    2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985578    2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985581    2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985584    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985588    2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985591    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985593    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985596    2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:33.987036 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985598    2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985601    2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985604    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985607    2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985994    2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.985999    2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986002    2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986005    2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986008    2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986011    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986014    2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986019    2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986023    2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986027    2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986030    2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986033    2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986036    2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986039    2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986042    2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:33.987533 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986045    2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986048    2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986050    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986053    2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986056    2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986059    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986061    2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986064    2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986067    2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986070    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986073    2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986075    2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986078    2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986080    2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986083    2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986086    2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986088    2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986091    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986094    2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:33.988002 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986096    2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986099    2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986102    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986104    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986107    2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986110    2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986113    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986116    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986118    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986122    2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986124    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986127    2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986129    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986132    2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986134    2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986137    2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986139    2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986142    2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986144    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986147    2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986149    2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:33.988472 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986152    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986154    2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986157    2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986160    2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986162    2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986165    2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986167    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986170    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986174    2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986177    2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986182    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986185    2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986187    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986190    2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986192    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986195    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986198    2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986201    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986204    2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986206    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:33.989086 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986209    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986211    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986215    2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986218    2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986221    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986223    2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986226    2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986228    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986231    2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986233    2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.986236    2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989211    2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989220    2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989228    2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989232    2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989238    2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989241    2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989245    2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989250    2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989253    2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989257    2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 04:38:33.989589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989260    2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989263    2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989266    2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989269    2570 flags.go:64] FLAG: --cgroup-root=""
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989272    2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989275    2570 flags.go:64] FLAG: --client-ca-file=""
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989278    2570 flags.go:64] FLAG: --cloud-config=""
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989281    2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989284    2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989288    2570 flags.go:64] FLAG: --cluster-domain=""
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989290    2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989293    2570 flags.go:64] FLAG: --config-dir=""
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989297    2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989301    2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989305    2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989308    2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989311    2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989315    2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989318    2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989321    2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989324    2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989327    2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989330    2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989334    2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989337    2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 04:38:33.990100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989340    2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989343    2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989347    2570 flags.go:64] FLAG: --enable-server="true"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989350    2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989354    2570 flags.go:64] FLAG: --event-burst="100"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989357    2570 flags.go:64] FLAG: --event-qps="50"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989360    2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989363    2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989366    2570 flags.go:64] FLAG: --eviction-hard=""
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989370    2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989373    2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989376    2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989378    2570 flags.go:64] FLAG: --eviction-soft=""
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989381    2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989384    2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989387    2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989391    2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989394    2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989397    2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989400    2570 flags.go:64] FLAG: --feature-gates=""
Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989404    2570 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989407 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989410 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989413 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989416 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 21 04:38:33.990717 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989419 2570 flags.go:64] FLAG: --help="false" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989422 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-140-11.ec2.internal" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989425 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989428 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989431 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989435 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989438 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989441 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989444 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 
04:38:33.989447 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989450 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989453 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989456 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989459 2570 flags.go:64] FLAG: --kube-reserved="" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989462 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989465 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989468 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989470 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989473 2570 flags.go:64] FLAG: --lock-file="" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989476 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989479 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989482 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989487 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 04:38:33.991315 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989490 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 04:38:33.991870 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989506 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989509 2570 flags.go:64] FLAG: --logging-format="text" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989512 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989516 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989519 2570 flags.go:64] FLAG: --manifest-url="" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989522 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989527 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989530 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989534 2570 flags.go:64] FLAG: --max-pods="110" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989537 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989540 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989543 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989546 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989549 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989552 2570 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989555 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989562 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989565 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989568 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989572 2570 flags.go:64] FLAG: --pod-cidr="" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989575 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989580 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989583 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 04:38:33.991870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989586 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989589 2570 flags.go:64] FLAG: --port="10250" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989592 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989596 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0319945534ef88493" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989599 2570 flags.go:64] FLAG: --qos-reserved="" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989602 
2570 flags.go:64] FLAG: --read-only-port="10255" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989605 2570 flags.go:64] FLAG: --register-node="true" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989608 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989611 2570 flags.go:64] FLAG: --register-with-taints="" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989615 2570 flags.go:64] FLAG: --registry-burst="10" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989618 2570 flags.go:64] FLAG: --registry-qps="5" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989621 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989624 2570 flags.go:64] FLAG: --reserved-memory="" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989628 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989631 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989634 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989637 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989640 2570 flags.go:64] FLAG: --runonce="false" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989643 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989646 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989649 2570 
flags.go:64] FLAG: --seccomp-default="false" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989652 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989655 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989658 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989661 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989664 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 04:38:33.992455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989667 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989670 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989673 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989676 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989680 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989682 2570 flags.go:64] FLAG: --system-cgroups="" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989685 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989691 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989693 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 21 
04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989697 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989701 2570 flags.go:64] FLAG: --tls-min-version="" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989704 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989707 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989709 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989713 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989716 2570 flags.go:64] FLAG: --v="2" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989720 2570 flags.go:64] FLAG: --version="false" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989724 2570 flags.go:64] FLAG: --vmodule="" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989730 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.989733 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989829 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989833 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989835 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989838 
2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:38:33.993120 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989841 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989843 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989846 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989849 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989851 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989854 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989857 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989859 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989862 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989864 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989867 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989871 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989875 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989879 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989881 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989884 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989887 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989890 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989893 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989896 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:38:33.993725 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989898 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989901 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989905 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989908 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989911 2570 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989914 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989918 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989920 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989923 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989926 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989929 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989931 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989934 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989937 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989940 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989942 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989945 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 
04:38:33.989948 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989950 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989953 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:38:33.994259 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989955 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989958 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989960 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989963 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989965 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989968 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989971 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989973 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989976 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989979 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:38:33.994757 
ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989981 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989984 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989986 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989989 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989993 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989995 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.989998 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990000 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990007 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990010 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:38:33.994757 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990013 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990015 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990018 2570 feature_gate.go:328] unrecognized feature 
gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990022 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990025 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990028 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990031 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990034 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990036 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990039 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990041 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990044 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990046 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990049 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990052 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:38:33.995241 ip-10-0-140-11 
kubenswrapper[2570]: W0421 04:38:33.990054 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990057 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990059 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990062 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:38:33.995241 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990065 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:38:33.995737 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990068 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:38:33.995737 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.990071 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:38:33.995737 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.990076 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 04:38:33.997580 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.997560 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 04:38:33.997614 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.997581 2570 server.go:532] "Golang settings" GOGC="" 
GOMAXPROCS="" GOTRACEBACK=""
Apr 21 04:38:33.997647 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997630 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:33.997647 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997636 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:33.997647 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997640 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:33.997647 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997643 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:33.997647 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997646 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997650 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997653 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997656 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997658 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997661 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997665 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997668 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997671 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997674 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997676 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997679 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997681 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997684 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997687 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997690 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997692 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997696 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997699 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:33.997774 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997703 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997707 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997712 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997715 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997718 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997720 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997723 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997726 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997729 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997731 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997734 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997737 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997739 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997742 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997744 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997748 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997751 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997754 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997757 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997759 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:33.998243 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997762 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997765 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997767 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997770 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997773 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997775 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997778 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997781 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997784 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997786 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997789 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997792 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997795 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997797 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997800 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997803 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997806 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997808 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997811 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997814 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:33.998747 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997816 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997819 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997821 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997824 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997826 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997829 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997832 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997835 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997837 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997840 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997842 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997844 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997847 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997850 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997852 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997855 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997857 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997860 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997862 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997865 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:33.999238 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997867 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997870 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997873 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.997879 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997976 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997980 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997983 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997986 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997989 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997992 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997995 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.997998 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998001 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998003 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998006 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998008 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:38:33.999842 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998011 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998013 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998016 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998019 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998022 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998024 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998028 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998032 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998035 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998037 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998040 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998043 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998046 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998048 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998051 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998053 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998056 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998059 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998062 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:38:34.000252 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998064 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998067 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998069 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998072 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998075 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998077 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998080 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998082 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998085 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998088 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998091 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998093 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998095 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998098 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998102 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998105 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998109 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998112 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998115 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:38:34.000739 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998117 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998120 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998123 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998125 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998127 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998130 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998133 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998136 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998138 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998141 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998144 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998146 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998148 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998151 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998153 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998156 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998158 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998161 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998163 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998166 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:38:34.001200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998169 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998172 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998174 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998177 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998179 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998181 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998184 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998186 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998189 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998192 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998194 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998197 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998199 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998202 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998204 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:38:34.001709 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:33.998207 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:38:34.002070 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.998212 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:38:34.002070 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:33.998998 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 04:38:34.003242 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.003229 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 04:38:34.004216 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.004204 2570 server.go:1019] "Starting client certificate rotation"
Apr 21 04:38:34.004313 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.004298 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:38:34.004346 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.004336 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:38:34.032222 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.032204 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:38:34.037824 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.036859 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:38:34.054484 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.054465 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 21 04:38:34.059576 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.059557 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:38:34.061042 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.061030 2570 log.go:25] "Validated CRI v1 image API"
Apr 21 04:38:34.062395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.062380 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 04:38:34.066637 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.066617 2570 fs.go:135] Filesystem UUIDs: map[147efa95-5c45-433b-94f0-efe6ec759fd2:/dev/nvme0n1p4 5ad1a12a-0e79-4743-a699-3d176d96664d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 21 04:38:34.066725 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.066636 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 04:38:34.073361 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.073250 2570 manager.go:217] Machine: {Timestamp:2026-04-21 04:38:34.071069197 +0000 UTC m=+0.465430928 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100012 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2870919323124f8dad960484a6a832 SystemUUID:ec287091-9323-124f-8dad-960484a6a832 BootID:6c847e68-9024-4fdc-985d-d7b20bb1e4be Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c8:b1:1f:62:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c8:b1:1f:62:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:af:15:f0:0d:d7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 04:38:34.073361 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.073356 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 04:38:34.073461 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.073437 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 04:38:34.075792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.075767 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 04:38:34.075968 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.075793 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-11.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 04:38:34.076056 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.075982 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 04:38:34.076056 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.075995 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 04:38:34.076056 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.076013 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 04:38:34.077747 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.077734 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 04:38:34.079460 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.079447 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 04:38:34.079616 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.079604 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 04:38:34.083383 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.083371 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 04:38:34.083441 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.083388 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 04:38:34.083441 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.083405 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 04:38:34.083441 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.083420 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 21 04:38:34.083441 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.083441 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 04:38:34.084658 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.084644 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 04:38:34.084736 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.084668 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 04:38:34.087849 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.087834 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 04:38:34.089368 ip-10-0-140-11
kubenswrapper[2570]: I0421 04:38:34.089355 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091421 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091437 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091443 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091448 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091454 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091460 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091465 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091471 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091477 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091483 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 04:38:34.091790 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091503 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 04:38:34.091790 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.091511 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 04:38:34.092520 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.092510 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 04:38:34.092520 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.092521 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 04:38:34.096469 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.096432 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-11.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 04:38:34.096598 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.096572 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-11.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 04:38:34.096649 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.096616 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 04:38:34.096956 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.096945 2570 server.go:1295] "Started kubelet" Apr 21 04:38:34.097005 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.096931 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 04:38:34.097902 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.097866 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 
04:38:34.097982 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.097848 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 04:38:34.097976 ip-10-0-140-11 systemd[1]: Started Kubernetes Kubelet. Apr 21 04:38:34.098145 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.098034 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 04:38:34.099329 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.099313 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 04:38:34.101757 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.101735 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 21 04:38:34.107108 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.106048 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-11.ec2.internal.18a8455bebb724ed default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-11.ec2.internal,UID:ip-10-0-140-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-11.ec2.internal,},FirstTimestamp:2026-04-21 04:38:34.096649453 +0000 UTC m=+0.491011186,LastTimestamp:2026-04-21 04:38:34.096649453 +0000 UTC m=+0.491011186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-11.ec2.internal,}" Apr 21 04:38:34.108042 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.108026 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2rb8h" Apr 21 04:38:34.108363 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.108346 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 04:38:34.108628 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.108607 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 04:38:34.108972 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.108953 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 04:38:34.109864 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109846 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 04:38:34.109864 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109865 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 04:38:34.110015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109870 2570 factory.go:55] Registering systemd factory Apr 21 04:38:34.110015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109883 2570 factory.go:223] Registration of the systemd container factory successfully Apr 21 04:38:34.110015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109846 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 04:38:34.110015 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.109938 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found" Apr 21 04:38:34.110015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109970 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 21 04:38:34.110015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.109979 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 21 04:38:34.110220 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.110101 2570 factory.go:153] Registering CRI-O factory Apr 21 04:38:34.110220 ip-10-0-140-11 kubenswrapper[2570]: 
I0421 04:38:34.110116 2570 factory.go:223] Registration of the crio container factory successfully Apr 21 04:38:34.110220 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.110170 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 04:38:34.110220 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.110209 2570 factory.go:103] Registering Raw factory Apr 21 04:38:34.110340 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.110225 2570 manager.go:1196] Started watching for new ooms in manager Apr 21 04:38:34.110849 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.110833 2570 manager.go:319] Starting recovery of all containers Apr 21 04:38:34.115640 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.115619 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2rb8h" Apr 21 04:38:34.119817 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.119643 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 04:38:34.119817 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.119675 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-11.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 04:38:34.120907 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.120894 2570 manager.go:324] Recovery completed Apr 21 04:38:34.124651 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:38:34.124635 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:34.127404 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.127383 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:34.127462 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.127410 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:34.127462 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.127423 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:34.127879 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.127865 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 04:38:34.127879 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.127879 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 04:38:34.128013 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.127896 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:38:34.130538 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.130525 2570 policy_none.go:49] "None policy: Start" Apr 21 04:38:34.130627 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.130542 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 04:38:34.130627 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.130555 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 21 04:38:34.173112 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173092 2570 manager.go:341] "Starting Device Plugin manager" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.173133 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 
04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173147 2570 server.go:85] "Starting device plugin registration server" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173405 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173418 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173549 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173639 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.173648 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.174098 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 04:38:34.182208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.174127 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-11.ec2.internal\" not found" Apr 21 04:38:34.213600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.213576 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 04:38:34.214740 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.214726 2570 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 21 04:38:34.214792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.214752 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 04:38:34.214792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.214769 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 04:38:34.214792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.214778 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 04:38:34.214944 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.214811 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 04:38:34.217588 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.217567 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:38:34.273579 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.273517 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:34.274368 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.274354 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:34.274426 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.274384 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:34.274426 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.274399 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:34.274490 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.274427 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.282520 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.282503 2570 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.282578 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.282525 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-11.ec2.internal\": node \"ip-10-0-140-11.ec2.internal\" not found" Apr 21 04:38:34.300958 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.300937 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found" Apr 21 04:38:34.315256 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.315225 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal"] Apr 21 04:38:34.315320 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.315296 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:34.316151 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.316138 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:34.316193 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.316163 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:34.316193 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.316173 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:34.317171 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317159 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:34.317336 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317321 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.317372 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317351 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:34.317848 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317828 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:34.317947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317854 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:34.317947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317833 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:34.317947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317875 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:34.317947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317893 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:34.317947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.317906 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:34.318828 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.318809 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.318911 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.318833 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:38:34.319510 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.319475 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:38:34.319575 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.319522 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:38:34.319575 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.319537 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:38:34.349158 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.349137 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-11.ec2.internal\" not found" node="ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.353375 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.353358 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-11.ec2.internal\" not found" node="ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.401405 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.401385 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found" Apr 21 04:38:34.411331 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.411313 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ef1d1c459c108254ec652c02cdc8b3f-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal\" (UID: \"2ef1d1c459c108254ec652c02cdc8b3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.411393 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.411336 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b067c9cf5db8c3de32f82b49fa084d46-config\") pod \"kube-apiserver-proxy-ip-10-0-140-11.ec2.internal\" (UID: \"b067c9cf5db8c3de32f82b49fa084d46\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.411393 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.411351 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ef1d1c459c108254ec652c02cdc8b3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal\" (UID: \"2ef1d1c459c108254ec652c02cdc8b3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.502080 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.502056 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found" Apr 21 04:38:34.512412 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.512394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b067c9cf5db8c3de32f82b49fa084d46-config\") pod \"kube-apiserver-proxy-ip-10-0-140-11.ec2.internal\" (UID: \"b067c9cf5db8c3de32f82b49fa084d46\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.512486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.512418 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ef1d1c459c108254ec652c02cdc8b3f-etc-kube\") 
pod \"kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal\" (UID: \"2ef1d1c459c108254ec652c02cdc8b3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.512486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.512439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ef1d1c459c108254ec652c02cdc8b3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal\" (UID: \"2ef1d1c459c108254ec652c02cdc8b3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.512486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.512480 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ef1d1c459c108254ec652c02cdc8b3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal\" (UID: \"2ef1d1c459c108254ec652c02cdc8b3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.512629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.512516 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2ef1d1c459c108254ec652c02cdc8b3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal\" (UID: \"2ef1d1c459c108254ec652c02cdc8b3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.512629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.512520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b067c9cf5db8c3de32f82b49fa084d46-config\") pod \"kube-apiserver-proxy-ip-10-0-140-11.ec2.internal\" (UID: \"b067c9cf5db8c3de32f82b49fa084d46\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" Apr 21 04:38:34.602868 
ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.602813 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found"
Apr 21 04:38:34.651269 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.651247 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal"
Apr 21 04:38:34.655621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:34.655608 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal"
Apr 21 04:38:34.702899 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.702867 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found"
Apr 21 04:38:34.803344 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.803321 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found"
Apr 21 04:38:34.903885 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:34.903820 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found"
Apr 21 04:38:35.004394 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.004364 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 04:38:35.004887 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:35.004375 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found"
Apr 21 04:38:35.004887 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.004519 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:38:35.104770 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:35.104747 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-11.ec2.internal\" not found"
Apr 21 04:38:35.106984 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.106962 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:38:35.108730 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.108710 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 04:38:35.109883 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.109866 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal"
Apr 21 04:38:35.118265 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.118235 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 04:33:34 +0000 UTC" deadline="2027-12-25 02:33:07.280363684 +0000 UTC"
Apr 21 04:38:35.118358 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.118264 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14709h54m32.162103261s"
Apr 21 04:38:35.120817 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.120799 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:38:35.123184 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.123169 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:38:35.124834 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.124820 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal"
Apr 21 04:38:35.134107 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.134089 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:38:35.138273 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.138256 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9rwt"
Apr 21 04:38:35.146260 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.146242 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9rwt"
Apr 21 04:38:35.163851 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.163800 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:38:35.262752 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:35.262707 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb067c9cf5db8c3de32f82b49fa084d46.slice/crio-9b87b71e9a116adaeb0dfafce9558daae867fb7209e8397c8249738dbeda5454 WatchSource:0}: Error finding container 9b87b71e9a116adaeb0dfafce9558daae867fb7209e8397c8249738dbeda5454: Status 404 returned error can't find the container with id 9b87b71e9a116adaeb0dfafce9558daae867fb7209e8397c8249738dbeda5454
Apr 21 04:38:35.263047 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:35.263031 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef1d1c459c108254ec652c02cdc8b3f.slice/crio-e09c9195fd6c65f9ae7a5e793c5dfeaeaf5fc098528c92e8e5eddaf7133de663 WatchSource:0}: Error finding container e09c9195fd6c65f9ae7a5e793c5dfeaeaf5fc098528c92e8e5eddaf7133de663: Status 404 returned error can't find the container with id e09c9195fd6c65f9ae7a5e793c5dfeaeaf5fc098528c92e8e5eddaf7133de663
Apr 21 04:38:35.268283 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.268264 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:38:35.472088 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:35.472010 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:38:36.085101 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.085022 2570 apiserver.go:52] "Watching apiserver"
Apr 21 04:38:36.091873 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.091847 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 04:38:36.093923 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.093895 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal","openshift-multus/multus-jqrlr","openshift-multus/network-metrics-daemon-c478k","openshift-ovn-kubernetes/ovnkube-node-tvl5z","kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal","openshift-dns/node-resolver-m6sh5","openshift-image-registry/node-ca-7cpf7","openshift-multus/multus-additional-cni-plugins-j8dxf","openshift-network-diagnostics/network-check-target-bctwd","openshift-network-operator/iptables-alerter-5kp22","kube-system/global-pull-secret-syncer-2wxxz","kube-system/konnectivity-agent-dw9qb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf","openshift-cluster-node-tuning-operator/tuned-kqvp7"]
Apr 21 04:38:36.096974 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.096953 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.099176 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.099042 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:36.099176 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.099130 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:36.099540 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.099488 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 04:38:36.099540 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.099532 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.099765 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.099743 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jhc9c\""
Apr 21 04:38:36.099765 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.099763 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.101274 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.101256 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:36.101368 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.101325 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:36.103782 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.103708 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.106143 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.105926 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 04:38:36.106143 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.105952 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vkg9t\""
Apr 21 04:38:36.106294 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.106179 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 04:38:36.106294 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.106264 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.106294 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.106266 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 04:38:36.106435 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.106390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.106974 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.106866 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 04:38:36.108721 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.108445 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.111724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.110706 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.111724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.110933 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:36.111724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.111010 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qk6l9\""
Apr 21 04:38:36.111724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.111093 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.111724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.111370 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.112008 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.111025 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:36.113648 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.113546 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vz5cd\""
Apr 21 04:38:36.113648 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.113628 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.113804 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.113676 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.114356 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.114334 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.116282 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.116262 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 04:38:36.116414 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.116397 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.116506 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.116397 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.116572 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.116521 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dmpdz\""
Apr 21 04:38:36.116735 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.116719 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.119003 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.118982 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.119153 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.118984 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 04:38:36.119289 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.119269 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ckwn4\""
Apr 21 04:38:36.119391 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.119338 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 04:38:36.119557 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.119541 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.120867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysconfig\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.120906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-var-lib-kubelet\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.120932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bnt\" (UniqueName: \"kubernetes.io/projected/065d76ba-0457-43ad-a208-bdd2d77366ce-kube-api-access-x9bnt\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.120956 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c70f167b-0eff-4017-9272-7a887e981112-hosts-file\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.120978 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6ff4930-586a-401d-8bf7-787218f408d0-serviceca\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121025 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-env-overrides\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-dbus\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121097 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121119 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-kubelet\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-etc-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-cni-bin\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-cni-netd\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-modprobe-d\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.122395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-systemd\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-host\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1b476ba-e89a-4760-8185-d97950e55be1-etc-tuned\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ff4930-586a-401d-8bf7-787218f408d0-host\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121338 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phc2f\" (UniqueName: \"kubernetes.io/projected/c6ff4930-586a-401d-8bf7-787218f408d0-kube-api-access-phc2f\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/065d76ba-0457-43ad-a208-bdd2d77366ce-ovn-node-metrics-cert\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-kubernetes\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-sys\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxgl\" (UniqueName: \"kubernetes.io/projected/f1b476ba-e89a-4760-8185-d97950e55be1-kube-api-access-bdxgl\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-run-netns\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-systemd\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-node-log\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-log-socket\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-ovnkube-script-lib\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121633 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1b476ba-e89a-4760-8185-d97950e55be1-tmp\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121658 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-systemd-units\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-slash\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-var-lib-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121750 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121823 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysctl-conf\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121854 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-run\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-lib-modules\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw52h\" (UniqueName: \"kubernetes.io/projected/1500cffd-5994-4d2a-bd36-855f9cf3efe5-kube-api-access-mw52h\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121954 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-ovnkube-config\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysctl-d\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.122016 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-ovn\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.122041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c70f167b-0eff-4017-9272-7a887e981112-tmp-dir\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.122077 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpdng\" (UniqueName: \"kubernetes.io/projected/c70f167b-0eff-4017-9272-7a887e981112-kube-api-access-tpdng\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.121824 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.123839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.122200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-kubelet-config\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:36.124570 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.124270 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5p9gd\""
Apr 21 04:38:36.124570 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.124479 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 04:38:36.124666 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.124556 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 04:38:36.124900 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.124879 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dw9qb"
Apr 21 04:38:36.124974 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.124910 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5kp22"
Apr 21 04:38:36.127410 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.127156 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:38:36.127410 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.127291 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 04:38:36.127615 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.127412 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6kmv9\""
Apr 21 04:38:36.127615 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.127472 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 04:38:36.127825 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.127807 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 04:38:36.127943 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.127826 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-smttk\""
Apr 21 04:38:36.128191 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.128177 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 04:38:36.147723 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.147694 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:33:35 +0000 UTC" deadline="2027-12-20 16:18:03.066087651 +0000 UTC"
Apr 21 04:38:36.147723 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.147724 2570 certificate_manager.go:431] "Waiting for next
certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14603h39m26.918367619s" Apr 21 04:38:36.210732 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.210702 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 04:38:36.219397 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.219354 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" event={"ID":"2ef1d1c459c108254ec652c02cdc8b3f","Type":"ContainerStarted","Data":"e09c9195fd6c65f9ae7a5e793c5dfeaeaf5fc098528c92e8e5eddaf7133de663"} Apr 21 04:38:36.220562 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.220527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" event={"ID":"b067c9cf5db8c3de32f82b49fa084d46","Type":"ContainerStarted","Data":"9b87b71e9a116adaeb0dfafce9558daae867fb7209e8397c8249738dbeda5454"} Apr 21 04:38:36.222899 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.222877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:36.222996 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.222915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-dbus\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:36.222996 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.222937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:36.222996 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.222967 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-sys-fs\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.222996 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.222993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223020 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-kubelet\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-systemd\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.223203 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:38:36.223070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-os-release\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/065d76ba-0457-43ad-a208-bdd2d77366ce-ovn-node-metrics-cert\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-sys\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-dbus\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3614312c-c7b0-4801-8358-d1d3c8043ef9-host-slash\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.223203 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:38:36.223165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-systemd\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-kubelet\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-ovnkube-script-lib\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223221 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1b476ba-e89a-4760-8185-d97950e55be1-tmp\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.223724 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:38:36.223236 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3d76811b-93de-4955-b346-ce731491aa8c-cni-binary-copy\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-device-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223300 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6282aefe-100f-4587-93df-5faf16b1e100-agent-certs\") pod \"konnectivity-agent-dw9qb\" (UID: \"6282aefe-100f-4587-93df-5faf16b1e100\") " pod="kube-system/konnectivity-agent-dw9qb" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-systemd-units\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223332 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-var-lib-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-run\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223396 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-cni-bin\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-cnibin\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: 
\"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw52h\" (UniqueName: \"kubernetes.io/projected/1500cffd-5994-4d2a-bd36-855f9cf3efe5-kube-api-access-mw52h\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-ovnkube-config\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223523 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-registration-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223529 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223552 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-ovn\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.223724 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-conf-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-sys\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bnt\" (UniqueName: 
\"kubernetes.io/projected/065d76ba-0457-43ad-a208-bdd2d77366ce-kube-api-access-x9bnt\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.223316 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.223749 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.223848 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:36.723783182 +0000 UTC m=+3.118144925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.223911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-systemd-units\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.223965 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:36.723948494 +0000 UTC m=+3.118310217 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-run\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-systemd\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224254 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-var-lib-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224369 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-systemd\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-system-cni-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224423 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-k8s-cni-cncf-io\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-netns\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.224600 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6ff4930-586a-401d-8bf7-787218f408d0-serviceca\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224519 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224542 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-env-overrides\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224581 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-ovnkube-config\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224656 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-run-ovn\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224660 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-ovnkube-script-lib\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224583 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3d76811b-93de-4955-b346-ce731491aa8c-multus-daemon-config\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224694 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-etc-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-cni-bin\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 
04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-cni-netd\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-modprobe-d\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-host\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224893 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-etc-openvswitch\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-cni-netd\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.225486 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-cni-bin\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.225486 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.224956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1b476ba-e89a-4760-8185-d97950e55be1-etc-tuned\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6ff4930-586a-401d-8bf7-787218f408d0-serviceca\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-host\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225031 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-modprobe-d\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ff4930-586a-401d-8bf7-787218f408d0-host\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225129 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phc2f\" (UniqueName: \"kubernetes.io/projected/c6ff4930-586a-401d-8bf7-787218f408d0-kube-api-access-phc2f\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ff4930-586a-401d-8bf7-787218f408d0-host\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225194 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-kubernetes\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225273 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/065d76ba-0457-43ad-a208-bdd2d77366ce-env-overrides\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxgl\" (UniqueName: \"kubernetes.io/projected/f1b476ba-e89a-4760-8185-d97950e55be1-kube-api-access-bdxgl\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225334 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-kubernetes\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-etc-kubernetes\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-run-netns\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-node-log\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-log-socket\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225663 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-node-log\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fx75\" (UniqueName: \"kubernetes.io/projected/3d76811b-93de-4955-b346-ce731491aa8c-kube-api-access-5fx75\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225628 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-run-netns\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-os-release\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225736 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-log-socket\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225770 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6282aefe-100f-4587-93df-5faf16b1e100-konnectivity-ca\") pod \"konnectivity-agent-dw9qb\" (UID: \"6282aefe-100f-4587-93df-5faf16b1e100\") " pod="kube-system/konnectivity-agent-dw9qb"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-slash\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225850 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysctl-conf\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-lib-modules\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-etc-selinux\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxlb4\" (UniqueName: \"kubernetes.io/projected/80ce6df8-dc91-474a-8b60-0ddac660fee8-kube-api-access-lxlb4\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.225961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysctl-d\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysctl-d\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226089 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysctl-conf\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226124 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-cnibin\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226150 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-lib-modules\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226154 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-kubelet\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-multus-certs\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/065d76ba-0457-43ad-a208-bdd2d77366ce-host-slash\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.226973 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-socket-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226234 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj86x\" (UniqueName: \"kubernetes.io/projected/3614312c-c7b0-4801-8358-d1d3c8043ef9-kube-api-access-hj86x\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226260 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-system-cni-dir\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c70f167b-0eff-4017-9272-7a887e981112-tmp-dir\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpdng\" (UniqueName: \"kubernetes.io/projected/c70f167b-0eff-4017-9272-7a887e981112-kube-api-access-tpdng\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-kubelet-config\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysconfig\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226450 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-var-lib-kubelet\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-cni-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-etc-sysconfig\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226536 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-socket-dir-parent\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-kubelet-config\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226622 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c70f167b-0eff-4017-9272-7a887e981112-hosts-file\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226653 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-cni-multus\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1b476ba-e89a-4760-8185-d97950e55be1-var-lib-kubelet\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226680 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-hostroot\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.227596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3614312c-c7b0-4801-8358-d1d3c8043ef9-iptables-alerter-script\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22"
Apr 21 04:38:36.228119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c70f167b-0eff-4017-9272-7a887e981112-hosts-file\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.228119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226735 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-cni-binary-copy\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.228119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226792 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7w9c\" (UniqueName: \"kubernetes.io/projected/a2b122c8-53b3-4280-9f62-b777ac256ac3-kube-api-access-k7w9c\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.228119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.226949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c70f167b-0eff-4017-9272-7a887e981112-tmp-dir\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.228119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.227561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1b476ba-e89a-4760-8185-d97950e55be1-tmp\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.228836 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.228806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/065d76ba-0457-43ad-a208-bdd2d77366ce-ovn-node-metrics-cert\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.229517 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.229433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1b476ba-e89a-4760-8185-d97950e55be1-etc-tuned\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.239805 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.239787 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:38:36.239900 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.239809 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:38:36.239900 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.239822 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:36.239900 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.239868 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed. No retries permitted until 2026-04-21 04:38:36.739855289 +0000 UTC m=+3.134217010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:36.241625 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.241602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw52h\" (UniqueName: \"kubernetes.io/projected/1500cffd-5994-4d2a-bd36-855f9cf3efe5-kube-api-access-mw52h\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:36.245211 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.245187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxgl\" (UniqueName: \"kubernetes.io/projected/f1b476ba-e89a-4760-8185-d97950e55be1-kube-api-access-bdxgl\") pod \"tuned-kqvp7\" (UID: \"f1b476ba-e89a-4760-8185-d97950e55be1\") " pod="openshift-cluster-node-tuning-operator/tuned-kqvp7"
Apr 21 04:38:36.245293 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.245231 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9bnt\" (UniqueName: \"kubernetes.io/projected/065d76ba-0457-43ad-a208-bdd2d77366ce-kube-api-access-x9bnt\") pod \"ovnkube-node-tvl5z\" (UID: \"065d76ba-0457-43ad-a208-bdd2d77366ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:38:36.245918 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.245898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpdng\" (UniqueName: \"kubernetes.io/projected/c70f167b-0eff-4017-9272-7a887e981112-kube-api-access-tpdng\") pod \"node-resolver-m6sh5\" (UID: \"c70f167b-0eff-4017-9272-7a887e981112\") " pod="openshift-dns/node-resolver-m6sh5"
Apr 21 04:38:36.245984 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.245945 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phc2f\" (UniqueName: \"kubernetes.io/projected/c6ff4930-586a-401d-8bf7-787218f408d0-kube-api-access-phc2f\") pod \"node-ca-7cpf7\" (UID: \"c6ff4930-586a-401d-8bf7-787218f408d0\") " pod="openshift-image-registry/node-ca-7cpf7"
Apr 21 04:38:36.327962 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.327937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3d76811b-93de-4955-b346-ce731491aa8c-cni-binary-copy\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328108 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.327974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-device-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328108 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.327996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6282aefe-100f-4587-93df-5faf16b1e100-agent-certs\") pod \"konnectivity-agent-dw9qb\" (UID: \"6282aefe-100f-4587-93df-5faf16b1e100\") " pod="kube-system/konnectivity-agent-dw9qb"
Apr 21 04:38:36.328108 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328017 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-cni-bin\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328108 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-cnibin\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328109 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-registration-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328111 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-device-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-cnibin\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328070 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-cni-bin\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-conf-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328167 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-registration-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-conf-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-system-cni-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-k8s-cni-cncf-io\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-netns\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328287 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3d76811b-93de-4955-b346-ce731491aa8c-multus-daemon-config\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328319 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328303 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-system-cni-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-k8s-cni-cncf-io\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-etc-kubernetes\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-netns\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fx75\" (UniqueName: \"kubernetes.io/projected/3d76811b-93de-4955-b346-ce731491aa8c-kube-api-access-5fx75\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-etc-kubernetes\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-os-release\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328451 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6282aefe-100f-4587-93df-5faf16b1e100-konnectivity-ca\") pod \"konnectivity-agent-dw9qb\" (UID: \"6282aefe-100f-4587-93df-5faf16b1e100\") " pod="kube-system/konnectivity-agent-dw9qb"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-etc-selinux\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxlb4\" (UniqueName: \"kubernetes.io/projected/80ce6df8-dc91-474a-8b60-0ddac660fee8-kube-api-access-lxlb4\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-cnibin\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-kubelet\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328614 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.328948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.328549 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-os-release\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf"
Apr 21 04:38:36.329618 ip-10-0-140-11
kubenswrapper[2570]: I0421 04:38:36.328849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-multus-certs\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329006 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-socket-dir\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj86x\" (UniqueName: \"kubernetes.io/projected/3614312c-c7b0-4801-8358-d1d3c8043ef9-kube-api-access-hj86x\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329083 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-system-cni-dir\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329119 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: 
\"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329171 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-cni-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3d76811b-93de-4955-b346-ce731491aa8c-multus-daemon-config\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-socket-dir-parent\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329231 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-cni-multus\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3d76811b-93de-4955-b346-ce731491aa8c-cni-binary-copy\") pod \"multus-jqrlr\" (UID: 
\"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-hostroot\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329277 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-cni-multus\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3614312c-c7b0-4801-8358-d1d3c8043ef9-iptables-alerter-script\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329310 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-cni-binary-copy\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-cnibin\") pod \"multus-jqrlr\" (UID: 
\"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7w9c\" (UniqueName: \"kubernetes.io/projected/a2b122c8-53b3-4280-9f62-b777ac256ac3-kube-api-access-k7w9c\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.329618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-var-lib-kubelet\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329406 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-sys-fs\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329447 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-os-release\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3614312c-c7b0-4801-8358-d1d3c8043ef9-host-slash\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329648 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3614312c-c7b0-4801-8358-d1d3c8043ef9-host-slash\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329705 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-host-run-multus-certs\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-hostroot\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.329806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-socket-dir\") pod 
\"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330009 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a2b122c8-53b3-4280-9f62-b777ac256ac3-system-cni-dir\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.330351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330111 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3614312c-c7b0-4801-8358-d1d3c8043ef9-iptables-alerter-script\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330408 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-cni-binary-copy\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330479 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-cni-dir\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-multus-socket-dir-parent\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-sys-fs\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6282aefe-100f-4587-93df-5faf16b1e100-konnectivity-ca\") pod \"konnectivity-agent-dw9qb\" (UID: \"6282aefe-100f-4587-93df-5faf16b1e100\") " pod="kube-system/konnectivity-agent-dw9qb" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330714 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80ce6df8-dc91-474a-8b60-0ddac660fee8-etc-selinux\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.330799 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.330793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3d76811b-93de-4955-b346-ce731491aa8c-os-release\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.331120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.331041 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a2b122c8-53b3-4280-9f62-b777ac256ac3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.331120 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.331105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6282aefe-100f-4587-93df-5faf16b1e100-agent-certs\") pod \"konnectivity-agent-dw9qb\" (UID: \"6282aefe-100f-4587-93df-5faf16b1e100\") " pod="kube-system/konnectivity-agent-dw9qb" Apr 21 04:38:36.349247 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.349189 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fx75\" (UniqueName: \"kubernetes.io/projected/3d76811b-93de-4955-b346-ce731491aa8c-kube-api-access-5fx75\") pod \"multus-jqrlr\" (UID: \"3d76811b-93de-4955-b346-ce731491aa8c\") " pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.349512 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.349475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxlb4\" (UniqueName: \"kubernetes.io/projected/80ce6df8-dc91-474a-8b60-0ddac660fee8-kube-api-access-lxlb4\") pod \"aws-ebs-csi-driver-node-sj7wf\" (UID: \"80ce6df8-dc91-474a-8b60-0ddac660fee8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.349755 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.349730 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hj86x\" (UniqueName: \"kubernetes.io/projected/3614312c-c7b0-4801-8358-d1d3c8043ef9-kube-api-access-hj86x\") pod \"iptables-alerter-5kp22\" (UID: \"3614312c-c7b0-4801-8358-d1d3c8043ef9\") " pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.351086 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.351067 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7w9c\" (UniqueName: \"kubernetes.io/projected/a2b122c8-53b3-4280-9f62-b777ac256ac3-kube-api-access-k7w9c\") pod \"multus-additional-cni-plugins-j8dxf\" (UID: \"a2b122c8-53b3-4280-9f62-b777ac256ac3\") " pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.407862 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.407823 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7cpf7" Apr 21 04:38:36.424629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.424606 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:38:36.435290 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.435272 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" Apr 21 04:38:36.439857 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.439841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dw9qb" Apr 21 04:38:36.450410 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.450386 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5kp22" Apr 21 04:38:36.456942 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.456926 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" Apr 21 04:38:36.464485 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.464466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jqrlr" Apr 21 04:38:36.473085 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.473068 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" Apr 21 04:38:36.480590 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.480562 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m6sh5" Apr 21 04:38:36.535534 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.535507 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:38:36.732415 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.732346 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:36.732415 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.732381 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:36.732591 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.732466 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:38:36.732591 ip-10-0-140-11 
kubenswrapper[2570]: E0421 04:38:36.732476 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:38:36.732591 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.732531 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:37.732516931 +0000 UTC m=+4.126878649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:38:36.732591 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.732545 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:37.732539464 +0000 UTC m=+4.126901182 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:38:36.833268 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:36.833240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:36.833419 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.833364 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:38:36.833419 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.833378 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:38:36.833419 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.833387 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:38:36.833562 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:36.833434 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed. 
No retries permitted until 2026-04-21 04:38:37.833420243 +0000 UTC m=+4.227781960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:38:37.076585 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.076424 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065d76ba_0457_43ad_a208_bdd2d77366ce.slice/crio-76714e8c0795d54d205b74d7961d120c553d59578851977bd7762e850e993765 WatchSource:0}: Error finding container 76714e8c0795d54d205b74d7961d120c553d59578851977bd7762e850e993765: Status 404 returned error can't find the container with id 76714e8c0795d54d205b74d7961d120c553d59578851977bd7762e850e993765 Apr 21 04:38:37.078682 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.078655 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b476ba_e89a_4760_8185_d97950e55be1.slice/crio-c298fd12191534f376b3a3cf63c7da63616d16cc56c734fd9d8c01792eadaa29 WatchSource:0}: Error finding container c298fd12191534f376b3a3cf63c7da63616d16cc56c734fd9d8c01792eadaa29: Status 404 returned error can't find the container with id c298fd12191534f376b3a3cf63c7da63616d16cc56c734fd9d8c01792eadaa29 Apr 21 04:38:37.079413 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.079387 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80ce6df8_dc91_474a_8b60_0ddac660fee8.slice/crio-b027877548939a1a6ef3e8c988dffa10d32d3a1625910a375ad20e549052fa15 WatchSource:0}: Error finding container 
b027877548939a1a6ef3e8c988dffa10d32d3a1625910a375ad20e549052fa15: Status 404 returned error can't find the container with id b027877548939a1a6ef3e8c988dffa10d32d3a1625910a375ad20e549052fa15 Apr 21 04:38:37.083846 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.083828 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b122c8_53b3_4280_9f62_b777ac256ac3.slice/crio-b63f0b19a696a2500ff26bcd88aacb59d735c59368163e4df9396755f704ecf3 WatchSource:0}: Error finding container b63f0b19a696a2500ff26bcd88aacb59d735c59368163e4df9396755f704ecf3: Status 404 returned error can't find the container with id b63f0b19a696a2500ff26bcd88aacb59d735c59368163e4df9396755f704ecf3 Apr 21 04:38:37.085578 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.085554 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3614312c_c7b0_4801_8358_d1d3c8043ef9.slice/crio-6bb534c1f273bbf963aac37ce878824e3b1af61f561a011e96bb1dbb45215532 WatchSource:0}: Error finding container 6bb534c1f273bbf963aac37ce878824e3b1af61f561a011e96bb1dbb45215532: Status 404 returned error can't find the container with id 6bb534c1f273bbf963aac37ce878824e3b1af61f561a011e96bb1dbb45215532 Apr 21 04:38:37.086715 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.086687 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ff4930_586a_401d_8bf7_787218f408d0.slice/crio-b7bb23b11b0ae0cdb0a89bfd7abc59fd7ef11fc56a967ff73507315edf63b24e WatchSource:0}: Error finding container b7bb23b11b0ae0cdb0a89bfd7abc59fd7ef11fc56a967ff73507315edf63b24e: Status 404 returned error can't find the container with id b7bb23b11b0ae0cdb0a89bfd7abc59fd7ef11fc56a967ff73507315edf63b24e Apr 21 04:38:37.087486 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.087457 2570 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6282aefe_100f_4587_93df_5faf16b1e100.slice/crio-037c282f9cf65e55dcfcc2f6625c1b43ebca3084c86281520a0cae1cc524cdef WatchSource:0}: Error finding container 037c282f9cf65e55dcfcc2f6625c1b43ebca3084c86281520a0cae1cc524cdef: Status 404 returned error can't find the container with id 037c282f9cf65e55dcfcc2f6625c1b43ebca3084c86281520a0cae1cc524cdef Apr 21 04:38:37.088151 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.087972 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d76811b_93de_4955_b346_ce731491aa8c.slice/crio-3b0ca32e1d39b26c65e9a9e52506b10c5ae840afb0b8416fff22c8a8705e1316 WatchSource:0}: Error finding container 3b0ca32e1d39b26c65e9a9e52506b10c5ae840afb0b8416fff22c8a8705e1316: Status 404 returned error can't find the container with id 3b0ca32e1d39b26c65e9a9e52506b10c5ae840afb0b8416fff22c8a8705e1316 Apr 21 04:38:37.089126 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:38:37.089081 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70f167b_0eff_4017_9272_7a887e981112.slice/crio-d9d1f228afdfad2900091eb8385ce33e6d101e768290376ae13383cf3a5c21ed WatchSource:0}: Error finding container d9d1f228afdfad2900091eb8385ce33e6d101e768290376ae13383cf3a5c21ed: Status 404 returned error can't find the container with id d9d1f228afdfad2900091eb8385ce33e6d101e768290376ae13383cf3a5c21ed Apr 21 04:38:37.148557 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.148414 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:33:35 +0000 UTC" deadline="2027-12-10 10:24:42.945075335 +0000 UTC" Apr 21 04:38:37.148557 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.148555 2570 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14357h46m5.796523185s"
Apr 21 04:38:37.223180 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.223152 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" event={"ID":"f1b476ba-e89a-4760-8185-d97950e55be1","Type":"ContainerStarted","Data":"c298fd12191534f376b3a3cf63c7da63616d16cc56c734fd9d8c01792eadaa29"}
Apr 21 04:38:37.224728 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.224706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" event={"ID":"b067c9cf5db8c3de32f82b49fa084d46","Type":"ContainerStarted","Data":"f1226ee5966762f0ec4ad40304e47b5624d2b17863c75f2cef8e0723d1699c33"}
Apr 21 04:38:37.225589 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.225572 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m6sh5" event={"ID":"c70f167b-0eff-4017-9272-7a887e981112","Type":"ContainerStarted","Data":"d9d1f228afdfad2900091eb8385ce33e6d101e768290376ae13383cf3a5c21ed"}
Apr 21 04:38:37.226533 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.226515 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"76714e8c0795d54d205b74d7961d120c553d59578851977bd7762e850e993765"}
Apr 21 04:38:37.227407 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.227387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7cpf7" event={"ID":"c6ff4930-586a-401d-8bf7-787218f408d0","Type":"ContainerStarted","Data":"b7bb23b11b0ae0cdb0a89bfd7abc59fd7ef11fc56a967ff73507315edf63b24e"}
Apr 21 04:38:37.228346 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.228328 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dw9qb"
event={"ID":"6282aefe-100f-4587-93df-5faf16b1e100","Type":"ContainerStarted","Data":"037c282f9cf65e55dcfcc2f6625c1b43ebca3084c86281520a0cae1cc524cdef"}
Apr 21 04:38:37.229181 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.229162 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jqrlr" event={"ID":"3d76811b-93de-4955-b346-ce731491aa8c","Type":"ContainerStarted","Data":"3b0ca32e1d39b26c65e9a9e52506b10c5ae840afb0b8416fff22c8a8705e1316"}
Apr 21 04:38:37.229957 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.229940 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5kp22" event={"ID":"3614312c-c7b0-4801-8358-d1d3c8043ef9","Type":"ContainerStarted","Data":"6bb534c1f273bbf963aac37ce878824e3b1af61f561a011e96bb1dbb45215532"}
Apr 21 04:38:37.230762 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.230743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerStarted","Data":"b63f0b19a696a2500ff26bcd88aacb59d735c59368163e4df9396755f704ecf3"}
Apr 21 04:38:37.231526 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.231485 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" event={"ID":"80ce6df8-dc91-474a-8b60-0ddac660fee8","Type":"ContainerStarted","Data":"b027877548939a1a6ef3e8c988dffa10d32d3a1625910a375ad20e549052fa15"}
Apr 21 04:38:37.237642 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.237593 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-11.ec2.internal" podStartSLOduration=2.237583332 podStartE2EDuration="2.237583332s" podCreationTimestamp="2026-04-21 04:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21
04:38:37.237219862 +0000 UTC m=+3.631581601" watchObservedRunningTime="2026-04-21 04:38:37.237583332 +0000 UTC m=+3.631945071"
Apr 21 04:38:37.740450 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.740386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:37.740450 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.740441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:37.740757 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.740593 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:37.740757 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.740654 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:39.740635558 +0000 UTC m=+6.134997282 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:37.741143 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.741002 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:38:37.741143 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.741071 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:39.741053299 +0000 UTC m=+6.135415036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:38:37.841464 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:37.840811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:37.841464 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.841017 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:38:37.841464 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.841035 2570
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:38:37.841464 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.841060 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:37.841464 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:37.841118 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed. No retries permitted until 2026-04-21 04:38:39.841099643 +0000 UTC m=+6.235461375 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:38.218818 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:38.217783 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:38.218818 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:38.217945 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:38.218818 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:38.218449 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:38.218818 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:38.218570 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:38.218818 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:38.218647 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:38.218818 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:38.218722 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:39.261150 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:39.260147 2570 generic.go:358] "Generic (PLEG): container finished" podID="2ef1d1c459c108254ec652c02cdc8b3f" containerID="1aa964ed0b0fdeed9d3c26a6a642cb4692ee8ccf2fc025618e58189daec52e19" exitCode=0
Apr 21 04:38:39.261150 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:39.260204 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" event={"ID":"2ef1d1c459c108254ec652c02cdc8b3f","Type":"ContainerDied","Data":"1aa964ed0b0fdeed9d3c26a6a642cb4692ee8ccf2fc025618e58189daec52e19"}
Apr 21 04:38:39.757452 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:39.757415 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:39.757634 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:39.757468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:39.757634 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.757600 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:39.757748 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.757666 2570 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:43.757644719 +0000 UTC m=+10.152006449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:39.757748 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.757601 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:38:39.757748 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.757739 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:43.757719555 +0000 UTC m=+10.152081287 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:38:39.859201 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:39.858600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:39.859201 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.858785 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:38:39.859201 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.858803 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:38:39.859201 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.858816 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:39.859201 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:39.858870 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed.
No retries permitted until 2026-04-21 04:38:43.85885321 +0000 UTC m=+10.253214945 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:40.216395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:40.215712 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:40.216395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:40.215735 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:40.216395 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:40.215846 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:40.216395 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:40.216241 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:40.216395 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:40.216285 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:40.216395 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:40.216352 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:42.216413 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:42.215643 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:42.216413 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:42.215668 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:42.216413 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:42.215643 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:42.216413 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:42.215809 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:42.216413 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:42.216290 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:42.216413 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:42.216377 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:43.789839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:43.789791 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:43.789839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:43.789841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:43.790350 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.789937 2570 secret.go:189]
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:43.790350 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.789937 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:38:43.790350 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.790004 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:51.789982514 +0000 UTC m=+18.184344231 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:38:43.790350 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.790079 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:38:51.790060698 +0000 UTC m=+18.184422419 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:38:43.890246 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:43.890208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:43.890418 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.890392 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:38:43.890418 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.890416 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:38:43.890569 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.890430 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:43.890569 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:43.890525 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed.
No retries permitted until 2026-04-21 04:38:51.890490287 +0000 UTC m=+18.284852022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:38:44.217037 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:44.216524 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:44.217037 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:44.216644 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:44.217037 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:44.216737 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:44.217037 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:44.216838 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:44.217037 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:44.216882 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:44.217037 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:44.216943 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:46.215216 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:46.215180 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:46.215661 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:46.215327 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:46.215661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:46.215180 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:46.215661 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:46.215419 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:46.215661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:46.215180 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:46.215661 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:46.215517 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:48.215731 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:48.215689 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:48.216183 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:48.215689 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:48.216183 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:48.215809 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:48.216183 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:48.215694 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:48.216183 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:48.215943 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:48.216183 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:48.216049 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:38:50.215369 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:50.215271 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:38:50.215369 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:50.215271 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:38:50.215369 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:50.215331 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:38:50.215885 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:50.215428 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:38:50.215885 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:50.215511 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:38:50.215885 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:50.215576 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5" Apr 21 04:38:51.854847 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:51.854806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:51.855329 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:51.854907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:51.855329 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.854973 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:38:51.855329 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.855012 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:38:51.855329 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.855047 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.855026461 +0000 UTC m=+34.249388179 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:38:51.855329 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.855068 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.855057552 +0000 UTC m=+34.249419275 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:38:51.956017 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:51.955972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:51.956163 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.956144 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:38:51.956228 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.956164 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:38:51.956228 ip-10-0-140-11 
kubenswrapper[2570]: E0421 04:38:51.956174 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:38:51.956310 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:51.956229 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.956209326 +0000 UTC m=+34.350571048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:38:52.215708 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:52.215621 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:52.215867 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:52.215621 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:52.215867 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:52.215739 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e" Apr 21 04:38:52.215867 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:52.215788 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810" Apr 21 04:38:52.215867 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:52.215621 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:52.216094 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:52.215878 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5" Apr 21 04:38:54.215858 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:54.215829 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:54.216303 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:54.215935 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e" Apr 21 04:38:54.216303 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:54.216041 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:54.216303 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:54.216166 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5" Apr 21 04:38:54.216303 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:54.216197 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:54.216303 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:54.216265 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810" Apr 21 04:38:55.288807 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.288548 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" event={"ID":"2ef1d1c459c108254ec652c02cdc8b3f","Type":"ContainerStarted","Data":"d3925bdd5a744f1343146ae14dd688ccd26b18c6e8e20649b7e605fe3d201462"} Apr 21 04:38:55.290127 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.290103 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7cpf7" event={"ID":"c6ff4930-586a-401d-8bf7-787218f408d0","Type":"ContainerStarted","Data":"e41ad282908a3ba47f70d215fa82aee92c941c40c682f73bc7b69e25666c130c"} Apr 21 04:38:55.291536 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.291510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dw9qb" event={"ID":"6282aefe-100f-4587-93df-5faf16b1e100","Type":"ContainerStarted","Data":"80045347c3244eb7bcac567458b0cf9f59739f94a205ac3533c10e4c9b4d8ef4"} Apr 21 04:38:55.292984 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.292959 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jqrlr" event={"ID":"3d76811b-93de-4955-b346-ce731491aa8c","Type":"ContainerStarted","Data":"d69ab90a93a95da2d6b30330030abfffbd487a0d2c7c211e5d02eebd9aa8a6a6"} Apr 21 04:38:55.294556 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.294530 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2b122c8-53b3-4280-9f62-b777ac256ac3" containerID="bf79359c33550a927f0ed6303aac8f635b8ab0f268a131ad09f4e888bf7c0033" exitCode=0 Apr 21 04:38:55.294647 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.294603 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" 
event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerDied","Data":"bf79359c33550a927f0ed6303aac8f635b8ab0f268a131ad09f4e888bf7c0033"} Apr 21 04:38:55.296186 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.296164 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" event={"ID":"80ce6df8-dc91-474a-8b60-0ddac660fee8","Type":"ContainerStarted","Data":"95d436be9890fdd383d10b5fe48e02aff325675dd3931f1c2138252558f6c438"} Apr 21 04:38:55.297857 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.297614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" event={"ID":"f1b476ba-e89a-4760-8185-d97950e55be1","Type":"ContainerStarted","Data":"440e04120018d7381bf2d18fe4e2e49199acdf82f2c9c93478a1a1b6ec10b25e"} Apr 21 04:38:55.298986 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.298961 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m6sh5" event={"ID":"c70f167b-0eff-4017-9272-7a887e981112","Type":"ContainerStarted","Data":"8588c458fce1d68fef93a3170b588853e00a7dc263caf0317015cc4b81a5fd57"} Apr 21 04:38:55.301235 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.301200 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-11.ec2.internal" podStartSLOduration=20.301186429 podStartE2EDuration="20.301186429s" podCreationTimestamp="2026-04-21 04:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:38:55.300691524 +0000 UTC m=+21.695053265" watchObservedRunningTime="2026-04-21 04:38:55.301186429 +0000 UTC m=+21.695548171" Apr 21 04:38:55.302358 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.302340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" 
event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"dae9434f37add5df5a62e49455fe816469e98ef3e55cf29ccc7ef9434e1cf725"} Apr 21 04:38:55.302449 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.302360 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"52eea60cd47a466e3dac183c6ee6248ddee788e5ad44dabb77bb721fdaf6b743"} Apr 21 04:38:55.302449 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.302373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"e9d8a06efdf26ccbcd31378304955f2965ef94ff88fa3ed9e5ede2b3ebacd966"} Apr 21 04:38:55.302449 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.302387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"df89576306c734867c7ec0c2748ddc2036645b879197d7202aba4382f641f2b1"} Apr 21 04:38:55.302449 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.302402 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"64fed1ede1db35b0154565e63b55de52b639311f9dd168438f7f3e26d173f14a"} Apr 21 04:38:55.302449 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.302413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"5e5910df57816449d1a1f531b69858bbb5d922cf5a04f09d5111140ae5f9605a"} Apr 21 04:38:55.312846 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.312810 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-kqvp7" podStartSLOduration=4.071766737 podStartE2EDuration="21.312800375s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.082403828 +0000 UTC m=+3.476765555" lastFinishedPulling="2026-04-21 04:38:54.323437461 +0000 UTC m=+20.717799193" observedRunningTime="2026-04-21 04:38:55.31246119 +0000 UTC m=+21.706822934" watchObservedRunningTime="2026-04-21 04:38:55.312800375 +0000 UTC m=+21.707162114" Apr 21 04:38:55.338100 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.338059 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7cpf7" podStartSLOduration=4.087959843 podStartE2EDuration="21.338048604s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.088378169 +0000 UTC m=+3.482739894" lastFinishedPulling="2026-04-21 04:38:54.338466932 +0000 UTC m=+20.732828655" observedRunningTime="2026-04-21 04:38:55.337982769 +0000 UTC m=+21.732344509" watchObservedRunningTime="2026-04-21 04:38:55.338048604 +0000 UTC m=+21.732410390" Apr 21 04:38:55.363094 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.363048 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m6sh5" podStartSLOduration=4.130767273 podStartE2EDuration="21.363033647s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.091169764 +0000 UTC m=+3.485531483" lastFinishedPulling="2026-04-21 04:38:54.323436114 +0000 UTC m=+20.717797857" observedRunningTime="2026-04-21 04:38:55.362676811 +0000 UTC m=+21.757038596" watchObservedRunningTime="2026-04-21 04:38:55.363033647 +0000 UTC m=+21.757395390" Apr 21 04:38:55.379578 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.379544 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jqrlr" podStartSLOduration=4.111540846 
podStartE2EDuration="21.379533829s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.090464581 +0000 UTC m=+3.484826309" lastFinishedPulling="2026-04-21 04:38:54.358457574 +0000 UTC m=+20.752819292" observedRunningTime="2026-04-21 04:38:55.379391941 +0000 UTC m=+21.773753681" watchObservedRunningTime="2026-04-21 04:38:55.379533829 +0000 UTC m=+21.773895569" Apr 21 04:38:55.391331 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.391300 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dw9qb" podStartSLOduration=4.158043644 podStartE2EDuration="21.391290664s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.090280021 +0000 UTC m=+3.484641739" lastFinishedPulling="2026-04-21 04:38:54.323527028 +0000 UTC m=+20.717888759" observedRunningTime="2026-04-21 04:38:55.391286954 +0000 UTC m=+21.785648694" watchObservedRunningTime="2026-04-21 04:38:55.391290664 +0000 UTC m=+21.785652405" Apr 21 04:38:55.632106 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.632086 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 04:38:55.885762 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.885732 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dw9qb" Apr 21 04:38:55.886388 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:55.886368 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dw9qb" Apr 21 04:38:56.186043 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.185886 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T04:38:55.632102935Z","UUID":"fb7d5c83-ad42-409b-9b01-98bd7f176676","Handler":null,"Name":"","Endpoint":""} Apr 21 04:38:56.188552 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.188511 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 04:38:56.188552 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.188541 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 04:38:56.219543 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.219517 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:56.219714 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.219517 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:56.219714 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:56.219638 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5" Apr 21 04:38:56.219836 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:56.219721 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810" Apr 21 04:38:56.219836 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.219521 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:56.219955 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:56.219864 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e" Apr 21 04:38:56.305905 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.305869 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5kp22" event={"ID":"3614312c-c7b0-4801-8358-d1d3c8043ef9","Type":"ContainerStarted","Data":"ddaaab0c8a84c84b75f079ce0e953a308ccda732e18790074c254f18a8604805"} Apr 21 04:38:56.307734 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.307704 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" event={"ID":"80ce6df8-dc91-474a-8b60-0ddac660fee8","Type":"ContainerStarted","Data":"a5b6e43914d87a20252e6823bea791850e9b9a40ae8c7ff40e8add69964697e9"} Apr 21 04:38:56.331438 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:56.331391 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5kp22" podStartSLOduration=5.077093221 podStartE2EDuration="22.331373822s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.087533555 +0000 UTC m=+3.481895273" lastFinishedPulling="2026-04-21 04:38:54.341814154 +0000 UTC m=+20.736175874" 
observedRunningTime="2026-04-21 04:38:56.33072826 +0000 UTC m=+22.725090001" watchObservedRunningTime="2026-04-21 04:38:56.331373822 +0000 UTC m=+22.725735562" Apr 21 04:38:57.311036 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:57.310938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" event={"ID":"80ce6df8-dc91-474a-8b60-0ddac660fee8","Type":"ContainerStarted","Data":"c150d9c79e6b49200a9d3f6dcbcfea5fe9649456396e580b0c717c57db057281"} Apr 21 04:38:57.314514 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:57.314447 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"1c4d7c66c47e328a9c2d3446a233ee4f450e9c1f24f045dd9616f4369e3bc732"} Apr 21 04:38:57.314658 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:57.314541 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 04:38:57.327742 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:57.327699 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sj7wf" podStartSLOduration=3.921675578 podStartE2EDuration="23.327688377s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.083443435 +0000 UTC m=+3.477805156" lastFinishedPulling="2026-04-21 04:38:56.48945622 +0000 UTC m=+22.883817955" observedRunningTime="2026-04-21 04:38:57.327140231 +0000 UTC m=+23.721501971" watchObservedRunningTime="2026-04-21 04:38:57.327688377 +0000 UTC m=+23.722050117" Apr 21 04:38:58.215728 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:58.215490 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:38:58.215908 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:58.215491 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:38:58.215908 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:38:58.215517 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:38:58.215908 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:58.215847 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5" Apr 21 04:38:58.216057 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:58.215921 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810" Apr 21 04:38:58.216057 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:38:58.216019 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e" Apr 21 04:39:00.215581 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.215385 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:39:00.216487 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.215390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:39:00.216487 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:00.215642 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810" Apr 21 04:39:00.216487 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.215444 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:39:00.216487 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:00.215723 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5" Apr 21 04:39:00.216487 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:00.215797 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e" Apr 21 04:39:00.321481 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.321450 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2b122c8-53b3-4280-9f62-b777ac256ac3" containerID="14e778d8ce8a274911a18884f07fa05a940bb404ab674a534e253bd196e86c99" exitCode=0 Apr 21 04:39:00.321658 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.321526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerDied","Data":"14e778d8ce8a274911a18884f07fa05a940bb404ab674a534e253bd196e86c99"} Apr 21 04:39:00.324735 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.324708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" event={"ID":"065d76ba-0457-43ad-a208-bdd2d77366ce","Type":"ContainerStarted","Data":"e08f8523b75c423edb1442a361690b1b9ad2348bc7981e7381e65be45d19ea0e"} Apr 21 04:39:00.325010 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.324996 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:39:00.325085 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.325014 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:39:00.339310 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:39:00.339288 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:39:00.366030 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:00.365991 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" podStartSLOduration=8.900017812 podStartE2EDuration="26.365980183s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.078548678 +0000 UTC m=+3.472910397" lastFinishedPulling="2026-04-21 04:38:54.544511036 +0000 UTC m=+20.938872768" observedRunningTime="2026-04-21 04:39:00.365667152 +0000 UTC m=+26.760028902" watchObservedRunningTime="2026-04-21 04:39:00.365980183 +0000 UTC m=+26.760341923"
Apr 21 04:39:01.327300 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.327274 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:39:01.344768 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.344741 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z"
Apr 21 04:39:01.561415 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.561386 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bctwd"]
Apr 21 04:39:01.561566 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.561526 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:01.561624 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:01.561607 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:39:01.564062 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.564038 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c478k"]
Apr 21 04:39:01.564179 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.564132 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:39:01.564225 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:01.564204 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:39:01.566620 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.566570 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2wxxz"]
Apr 21 04:39:01.566688 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:01.566660 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:39:01.566740 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:01.566723 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:39:02.330366 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:02.330331 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2b122c8-53b3-4280-9f62-b777ac256ac3" containerID="02c2adb96d58a810b1a01786baa65a4f7e6ba87e23bc76466183c9b1036b0cc8" exitCode=0
Apr 21 04:39:02.330843 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:02.330407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerDied","Data":"02c2adb96d58a810b1a01786baa65a4f7e6ba87e23bc76466183c9b1036b0cc8"}
Apr 21 04:39:02.648060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:02.647971 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dw9qb"
Apr 21 04:39:02.648205 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:02.648101 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 04:39:02.648591 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:02.648570 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dw9qb"
Apr 21 04:39:03.215081 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:03.215057 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:39:03.215218 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:03.215085 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:39:03.215218 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:03.215175 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:39:03.215218 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:03.215188 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:03.215366 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:03.215312 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:39:03.215421 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:03.215392 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:39:03.334433 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:03.334400 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2b122c8-53b3-4280-9f62-b777ac256ac3" containerID="07ada84b73802c46a24754de665513e39f1a6132afd5fc6bf108ba3790ba3cc2" exitCode=0
Apr 21 04:39:03.334867 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:03.334487 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerDied","Data":"07ada84b73802c46a24754de665513e39f1a6132afd5fc6bf108ba3790ba3cc2"}
Apr 21 04:39:05.215445 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:05.215242 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:05.215926 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:05.215254 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:39:05.215926 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:05.215555 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:39:05.215926 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:05.215625 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:39:05.215926 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:05.215242 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:39:05.215926 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:05.215738 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:39:07.215376 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.215341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:39:07.215840 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.215341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:39:07.215840 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.215476 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2wxxz" podUID="a8a94be7-29d2-46f0-af2a-5a46e5fe8810"
Apr 21 04:39:07.215840 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.215341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:07.215840 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.215574 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:39:07.215840 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.215628 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bctwd" podUID="da67b91f-e17f-4c7a-a45a-dddc62350e0e"
Apr 21 04:39:07.422187 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.422110 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-11.ec2.internal" event="NodeReady"
Apr 21 04:39:07.422355 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.422256 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 04:39:07.455160 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.455129 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5647cc45cd-54w7k"]
Apr 21 04:39:07.488978 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.488771 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh"]
Apr 21 04:39:07.488978 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.488926 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.491870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.491844 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 04:39:07.491870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.491866 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlhzm\""
Apr 21 04:39:07.492142 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.491869 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 04:39:07.492142 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.492020 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 04:39:07.502088 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.502063 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"]
Apr 21 04:39:07.502214 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.502197 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh"
Apr 21 04:39:07.504870 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.504844 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-m7t55\""
Apr 21 04:39:07.504979 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.504807 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 04:39:07.504979 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.504960 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 04:39:07.505090 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.504853 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 21 04:39:07.505208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.505184 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 04:39:07.513646 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.513627 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"]
Apr 21 04:39:07.513782 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.513763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:39:07.514143 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.514120 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 04:39:07.515866 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.515844 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 04:39:07.536297 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.536277 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m4stv"]
Apr 21 04:39:07.536455 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.536441 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.538903 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.538878 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 21 04:39:07.538903 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.538878 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 21 04:39:07.539172 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.539158 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 21 04:39:07.539432 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.539411 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 21 04:39:07.557338 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.557318 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xwsgf"]
Apr 21 04:39:07.557483 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.557460 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m4stv"
Apr 21 04:39:07.559754 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.559735 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 04:39:07.560029 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.560007 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f9n52\""
Apr 21 04:39:07.560374 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.560357 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 04:39:07.572166 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572147 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5647cc45cd-54w7k"]
Apr 21 04:39:07.572271 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572177 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh"]
Apr 21 04:39:07.572271 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572189 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"]
Apr 21 04:39:07.572271 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572201 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"]
Apr 21 04:39:07.572271 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572211 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xwsgf"]
Apr 21 04:39:07.572271 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572223 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m4stv"]
Apr 21 04:39:07.572514 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.572275 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:39:07.574797 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.574760 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 04:39:07.574797 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.574776 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 04:39:07.574797 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.574785 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 04:39:07.574988 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.574814 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ljx4n\""
Apr 21 04:39:07.583544 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583522 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-trusted-ca\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583623 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-ca-trust-extracted\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583623 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-bound-sa-token\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583698 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583679 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm7p9\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-kube-api-access-tm7p9\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583749 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-image-registry-private-configuration\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583749 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583739 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-certificates\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583817 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583786 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-installation-pull-secrets\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.583872 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.583858 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684285 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-ca-trust-extracted\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684285 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-bound-sa-token\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684507 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw5rb\" (UniqueName: \"kubernetes.io/projected/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-kube-api-access-tw5rb\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:39:07.684507 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpj7k\" (UniqueName: \"kubernetes.io/projected/b64d2b1f-0a86-4237-801c-025099403da9-kube-api-access-mpj7k\") pod \"managed-serviceaccount-addon-agent-5f9b79b6d9-276jh\" (UID: \"b64d2b1f-0a86-4237-801c-025099403da9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh"
Apr 21 04:39:07.684507 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm7p9\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-kube-api-access-tm7p9\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684648 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b64d2b1f-0a86-4237-801c-025099403da9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f9b79b6d9-276jh\" (UID: \"b64d2b1f-0a86-4237-801c-025099403da9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh"
Apr 21 04:39:07.684648 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.684742 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d7b4054-d280-4074-b713-a7fe58a0ee82-config-volume\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:39:07.684742 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-image-registry-private-configuration\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684819 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-certificates\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684819 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684769 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-installation-pull-secrets\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684819 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684786 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvs5q\" (UniqueName: \"kubernetes.io/projected/3d7b4054-d280-4074-b713-a7fe58a0ee82-kube-api-access-wvs5q\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:39:07.684933 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.684933 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684861 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ade469fe-d322-489f-8cdf-95e181f955f6-tmp\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:39:07.684933 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xnfl\" (UniqueName: \"kubernetes.io/projected/ade469fe-d322-489f-8cdf-95e181f955f6-kube-api-access-2xnfl\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:39:07.684933 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.685063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684931 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.685063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684955 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7b4054-d280-4074-b713-a7fe58a0ee82-tmp-dir\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:39:07.685063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:39:07.685063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.684988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:39:07.685063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.685007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-ca\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.685063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.685038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-trusted-ca\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.685306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.685064 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-hub\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.685306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.685099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ade469fe-d322-489f-8cdf-95e181f955f6-klusterlet-config\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:39:07.685306 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.685126 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6vs\" (UniqueName: \"kubernetes.io/projected/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-kube-api-access-6p6vs\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:39:07.685413 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.685404 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:39:07.685462 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.685418 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found
Apr 21 04:39:07.685523 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.685474 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:08.185453082 +0000 UTC m=+34.579814801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found
Apr 21 04:39:07.695816 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.695353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-ca-trust-extracted\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.695816 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.695785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-certificates\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.696088 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.696064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-trusted-ca\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.699835 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.699791 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-installation-pull-secrets\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.699959 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.699832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-bound-sa-token\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.699959 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.699832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-image-registry-private-configuration\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.699959 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.699941 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm7p9\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-kube-api-access-tm7p9\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:39:07.785547 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ade469fe-d322-489f-8cdf-95e181f955f6-tmp\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:39:07.785547 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xnfl\" (UniqueName:
\"kubernetes.io/projected/ade469fe-d322-489f-8cdf-95e181f955f6-kube-api-access-2xnfl\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:07.785780 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.785780 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.785780 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7b4054-d280-4074-b713-a7fe58a0ee82-tmp-dir\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.785780 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:07.785780 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:39:07.785765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785790 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-ca\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-hub\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ade469fe-d322-489f-8cdf-95e181f955f6-klusterlet-config\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6vs\" (UniqueName: \"kubernetes.io/projected/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-kube-api-access-6p6vs\") pod 
\"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw5rb\" (UniqueName: \"kubernetes.io/projected/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-kube-api-access-tw5rb\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpj7k\" (UniqueName: \"kubernetes.io/projected/b64d2b1f-0a86-4237-801c-025099403da9-kube-api-access-mpj7k\") pod \"managed-serviceaccount-addon-agent-5f9b79b6d9-276jh\" (UID: \"b64d2b1f-0a86-4237-801c-025099403da9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.785969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ade469fe-d322-489f-8cdf-95e181f955f6-tmp\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:07.786024 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.786011 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b64d2b1f-0a86-4237-801c-025099403da9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f9b79b6d9-276jh\" (UID: \"b64d2b1f-0a86-4237-801c-025099403da9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" 
Apr 21 04:39:07.786431 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.786082 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:07.786431 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.786142 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:08.28611987 +0000 UTC m=+34.680481603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:07.786431 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.786360 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d7b4054-d280-4074-b713-a7fe58a0ee82-tmp-dir\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.786431 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.786368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.786431 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.786421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d7b4054-d280-4074-b713-a7fe58a0ee82-config-volume\") pod \"dns-default-m4stv\" (UID: 
\"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.786710 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.786442 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:07.786710 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.786456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvs5q\" (UniqueName: \"kubernetes.io/projected/3d7b4054-d280-4074-b713-a7fe58a0ee82-kube-api-access-wvs5q\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.786710 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.786521 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:08.286481672 +0000 UTC m=+34.680843404 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:07.786710 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.786634 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.787259 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.787236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d7b4054-d280-4074-b713-a7fe58a0ee82-config-volume\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.789465 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.789440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.789669 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.789635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-ca\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.789762 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.789717 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-hub\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.789863 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.789841 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b64d2b1f-0a86-4237-801c-025099403da9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f9b79b6d9-276jh\" (UID: \"b64d2b1f-0a86-4237-801c-025099403da9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" Apr 21 04:39:07.789924 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.789872 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.790117 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.790097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ade469fe-d322-489f-8cdf-95e181f955f6-klusterlet-config\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:07.798419 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.798353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvs5q\" (UniqueName: 
\"kubernetes.io/projected/3d7b4054-d280-4074-b713-a7fe58a0ee82-kube-api-access-wvs5q\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:07.799123 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.799098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xnfl\" (UniqueName: \"kubernetes.io/projected/ade469fe-d322-489f-8cdf-95e181f955f6-kube-api-access-2xnfl\") pod \"klusterlet-addon-workmgr-9f9459848-qc2s5\" (UID: \"ade469fe-d322-489f-8cdf-95e181f955f6\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:07.799861 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.799838 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6vs\" (UniqueName: \"kubernetes.io/projected/c7b58ecd-52e3-45cd-9fc2-fc066d5faadc-kube-api-access-6p6vs\") pod \"cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc\" (UID: \"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.800381 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.800357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw5rb\" (UniqueName: \"kubernetes.io/projected/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-kube-api-access-tw5rb\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:07.801558 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.801535 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpj7k\" (UniqueName: \"kubernetes.io/projected/b64d2b1f-0a86-4237-801c-025099403da9-kube-api-access-mpj7k\") pod \"managed-serviceaccount-addon-agent-5f9b79b6d9-276jh\" (UID: \"b64d2b1f-0a86-4237-801c-025099403da9\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" Apr 21 04:39:07.818838 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.818812 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" Apr 21 04:39:07.827593 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.827572 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:07.846289 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.846261 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" Apr 21 04:39:07.887516 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.887467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:39:07.887682 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.887623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:39:07.887682 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.887640 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:07.887788 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.887709 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:39.887689697 +0000 UTC m=+66.282051419 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:07.887788 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.887727 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:39:07.887788 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.887778 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret podName:a8a94be7-29d2-46f0-af2a-5a46e5fe8810 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:39.887762819 +0000 UTC m=+66.282124551 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret") pod "global-pull-secret-syncer-2wxxz" (UID: "a8a94be7-29d2-46f0-af2a-5a46e5fe8810") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:39:07.988126 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:07.988039 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:39:07.988283 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.988222 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:39:07.988283 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.988247 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:39:07.988283 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.988261 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fd2ft for pod openshift-network-diagnostics/network-check-target-bctwd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:07.988400 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:07.988319 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft podName:da67b91f-e17f-4c7a-a45a-dddc62350e0e nodeName:}" failed. 
No retries permitted until 2026-04-21 04:39:39.988299122 +0000 UTC m=+66.382660846 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fd2ft" (UniqueName: "kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft") pod "network-check-target-bctwd" (UID: "da67b91f-e17f-4c7a-a45a-dddc62350e0e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:08.190871 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:08.190635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:39:08.191050 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.190919 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:39:08.191050 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.190943 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found Apr 21 04:39:08.191050 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.191013 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:09.19098485 +0000 UTC m=+35.585346583 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found Apr 21 04:39:08.291508 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:08.291404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:08.291508 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:08.291459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:08.292208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.291599 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:08.292208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.291665 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:08.292208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.291701 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:09.291676383 +0000 UTC m=+35.686038105 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:08.292208 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:08.291718 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:09.29171021 +0000 UTC m=+35.686071928 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:09.202168 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.201627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:39:09.202168 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.201817 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:39:09.202168 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.201836 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found Apr 21 04:39:09.202168 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.201904 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:11.201881748 +0000 UTC m=+37.596243491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found Apr 21 04:39:09.215984 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.215801 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:39:09.216247 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.215804 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd" Apr 21 04:39:09.216626 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.215808 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:39:09.220230 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.218672 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 04:39:09.220230 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.218838 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 04:39:09.220230 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.218956 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 04:39:09.220230 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.219450 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-979g8\"" Apr 21 04:39:09.220230 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.219631 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgxqw\"" Apr 21 04:39:09.220230 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.219747 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 04:39:09.279694 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.279520 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"] Apr 21 04:39:09.282513 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.282478 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"] Apr 21 04:39:09.283217 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.283183 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh"] Apr 21 04:39:09.302923 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.302896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:09.303371 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:09.302928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:09.303371 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.303034 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:09.303371 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.303036 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:09.303371 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.303088 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:11.303072313 +0000 UTC m=+37.697434030 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:09.303371 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:09.303110 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:11.303098336 +0000 UTC m=+37.697460054 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:09.363269 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:39:09.363228 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade469fe_d322_489f_8cdf_95e181f955f6.slice/crio-7ecdbd4c301001e827853b551bf10c4cf9c785550d55d9ddd8b6c87951571d0e WatchSource:0}: Error finding container 7ecdbd4c301001e827853b551bf10c4cf9c785550d55d9ddd8b6c87951571d0e: Status 404 returned error can't find the container with id 7ecdbd4c301001e827853b551bf10c4cf9c785550d55d9ddd8b6c87951571d0e Apr 21 04:39:09.363983 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:39:09.363863 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7b58ecd_52e3_45cd_9fc2_fc066d5faadc.slice/crio-9b3aad92e2dbdbef329c41db3ba65006c59c2796c64136bb6c54043aa468ebde WatchSource:0}: Error finding container 9b3aad92e2dbdbef329c41db3ba65006c59c2796c64136bb6c54043aa468ebde: Status 404 returned error can't find the container with id 
9b3aad92e2dbdbef329c41db3ba65006c59c2796c64136bb6c54043aa468ebde Apr 21 04:39:09.364578 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:39:09.364488 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb64d2b1f_0a86_4237_801c_025099403da9.slice/crio-d646bec142271a2b04dcd81f291c356f7a25427bc23cc313b291cd93fdc6ab91 WatchSource:0}: Error finding container d646bec142271a2b04dcd81f291c356f7a25427bc23cc313b291cd93fdc6ab91: Status 404 returned error can't find the container with id d646bec142271a2b04dcd81f291c356f7a25427bc23cc313b291cd93fdc6ab91 Apr 21 04:39:10.353244 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:10.353172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" event={"ID":"b64d2b1f-0a86-4237-801c-025099403da9","Type":"ContainerStarted","Data":"d646bec142271a2b04dcd81f291c356f7a25427bc23cc313b291cd93fdc6ab91"} Apr 21 04:39:10.357532 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:10.357452 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" event={"ID":"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc","Type":"ContainerStarted","Data":"9b3aad92e2dbdbef329c41db3ba65006c59c2796c64136bb6c54043aa468ebde"} Apr 21 04:39:10.361825 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:10.361771 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" event={"ID":"ade469fe-d322-489f-8cdf-95e181f955f6","Type":"ContainerStarted","Data":"7ecdbd4c301001e827853b551bf10c4cf9c785550d55d9ddd8b6c87951571d0e"} Apr 21 04:39:10.370814 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:10.369859 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2b122c8-53b3-4280-9f62-b777ac256ac3" 
containerID="4b5be35f9cb7907591093cc117f885730529c3189b737fba735d6ab6bb5d4004" exitCode=0 Apr 21 04:39:10.370814 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:10.369897 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerDied","Data":"4b5be35f9cb7907591093cc117f885730529c3189b737fba735d6ab6bb5d4004"} Apr 21 04:39:11.222468 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:11.222427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:39:11.222685 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.222626 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:39:11.222685 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.222645 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found Apr 21 04:39:11.222833 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.222724 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:15.222704105 +0000 UTC m=+41.617065839 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found Apr 21 04:39:11.323640 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:11.323600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:11.323845 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:11.323652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:11.323845 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.323799 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:11.323972 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.323863 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:15.323843013 +0000 UTC m=+41.718204736 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:11.324292 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.324270 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:11.324377 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:11.324323 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:15.324308868 +0000 UTC m=+41.718670589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:11.380617 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:11.380546 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2b122c8-53b3-4280-9f62-b777ac256ac3" containerID="f94a350d6d2b85a48153aea5d52b217aea6c414af87084c6bb4f8f8f8f8595cd" exitCode=0 Apr 21 04:39:11.380617 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:11.380602 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerDied","Data":"f94a350d6d2b85a48153aea5d52b217aea6c414af87084c6bb4f8f8f8f8595cd"} Apr 21 04:39:15.258597 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:15.258560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod 
\"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:39:15.258943 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.258704 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:39:15.258943 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.258724 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found Apr 21 04:39:15.258943 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.258780 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:23.258763733 +0000 UTC m=+49.653125455 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found Apr 21 04:39:15.359588 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:15.359559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:15.359689 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:15.359603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:15.359742 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.359721 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:15.359785 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.359760 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:15.359823 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.359803 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:23.35978454 +0000 UTC m=+49.754146258 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:15.359823 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:15.359817 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:23.359811406 +0000 UTC m=+49.754173124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:16.393220 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.393183 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" event={"ID":"a2b122c8-53b3-4280-9f62-b777ac256ac3","Type":"ContainerStarted","Data":"423ce39379aee498f9bc1911647c7d53e6d05dfe9701580f66297ea6b4014fa1"} Apr 21 04:39:16.394568 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.394533 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" event={"ID":"b64d2b1f-0a86-4237-801c-025099403da9","Type":"ContainerStarted","Data":"f5c0d11516465eba57d13b0f79e091e97cd61655c62b615abf4c2d80e8662d7f"} Apr 21 04:39:16.395732 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.395709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" event={"ID":"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc","Type":"ContainerStarted","Data":"05d0e7dbfb2b300a9f03d632984b8ceb468ceec2b6a042271aea7b6b69960f2e"} Apr 21 
04:39:16.397170 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.397136 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" event={"ID":"ade469fe-d322-489f-8cdf-95e181f955f6","Type":"ContainerStarted","Data":"982d3bc35bc4edc11a0409ebec0c3c9a762d488f3d16504c9649fbab0664681d"} Apr 21 04:39:16.397334 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.397315 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:16.399143 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.399125 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" Apr 21 04:39:16.417505 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.417450 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j8dxf" podStartSLOduration=10.106713097 podStartE2EDuration="42.417423741s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:38:37.085615141 +0000 UTC m=+3.479976859" lastFinishedPulling="2026-04-21 04:39:09.39632578 +0000 UTC m=+35.790687503" observedRunningTime="2026-04-21 04:39:16.417202475 +0000 UTC m=+42.811564242" watchObservedRunningTime="2026-04-21 04:39:16.417423741 +0000 UTC m=+42.811785484" Apr 21 04:39:16.432535 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.432475 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" podStartSLOduration=31.491645787 podStartE2EDuration="37.432463297s" podCreationTimestamp="2026-04-21 04:38:39 +0000 UTC" firstStartedPulling="2026-04-21 04:39:09.372666862 +0000 UTC m=+35.767028580" lastFinishedPulling="2026-04-21 04:39:15.313484371 +0000 
UTC m=+41.707846090" observedRunningTime="2026-04-21 04:39:16.431638673 +0000 UTC m=+42.826000415" watchObservedRunningTime="2026-04-21 04:39:16.432463297 +0000 UTC m=+42.826825036" Apr 21 04:39:16.448467 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:16.448429 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" podStartSLOduration=31.489913392 podStartE2EDuration="37.44841814s" podCreationTimestamp="2026-04-21 04:38:39 +0000 UTC" firstStartedPulling="2026-04-21 04:39:09.372515106 +0000 UTC m=+35.766876829" lastFinishedPulling="2026-04-21 04:39:15.331019845 +0000 UTC m=+41.725381577" observedRunningTime="2026-04-21 04:39:16.447900393 +0000 UTC m=+42.842262134" watchObservedRunningTime="2026-04-21 04:39:16.44841814 +0000 UTC m=+42.842779881" Apr 21 04:39:18.403213 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:18.403174 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" event={"ID":"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc","Type":"ContainerStarted","Data":"dde4298fe7e1042e88652980ebc9568f31d0436e7e8e75f5600fbe11ccbafb17"} Apr 21 04:39:18.403213 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:18.403214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" event={"ID":"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc","Type":"ContainerStarted","Data":"6a302c19df2bf94c6fb763342a86783a66b3881b0d7826abb8425c9207b67511"} Apr 21 04:39:18.422861 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:18.422813 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" podStartSLOduration=31.215364582 podStartE2EDuration="39.422801287s" podCreationTimestamp="2026-04-21 04:38:39 +0000 UTC" firstStartedPulling="2026-04-21 
04:39:09.372715677 +0000 UTC m=+35.767077399" lastFinishedPulling="2026-04-21 04:39:17.580152383 +0000 UTC m=+43.974514104" observedRunningTime="2026-04-21 04:39:18.421842341 +0000 UTC m=+44.816204081" watchObservedRunningTime="2026-04-21 04:39:18.422801287 +0000 UTC m=+44.817163050" Apr 21 04:39:23.316928 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:23.316888 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:39:23.317293 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.317028 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:39:23.317293 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.317048 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found Apr 21 04:39:23.317293 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.317114 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:39.317097927 +0000 UTC m=+65.711459645 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found Apr 21 04:39:23.417947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:23.417918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:23.417947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:23.417949 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:23.418123 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.418069 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:23.418123 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.418083 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:23.418189 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.418130 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:39.418113333 +0000 UTC m=+65.812475067 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:23.418189 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:23.418144 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:39.418137871 +0000 UTC m=+65.812499589 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:33.390961 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:33.390931 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tvl5z" Apr 21 04:39:39.337857 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.337820 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:39:39.338294 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.337963 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:39:39.338294 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.337977 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found Apr 21 
04:39:39.338294 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.338043 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:11.338025475 +0000 UTC m=+97.732387216 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found Apr 21 04:39:39.438208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.438174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf" Apr 21 04:39:39.438208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.438209 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv" Apr 21 04:39:39.438441 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.438349 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:39.438441 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.438390 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:39.438441 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.438431 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:11.43841703 +0000 UTC m=+97.832778747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found Apr 21 04:39:39.438441 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.438443 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:11.438437214 +0000 UTC m=+97.832798932 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found Apr 21 04:39:39.941993 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.941959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz" Apr 21 04:39:39.941993 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.941999 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:39:39.944630 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.944611 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 04:39:39.944691 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.944638 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:39:39.952705 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.952689 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 04:39:39.952762 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:39:39.952739 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:43.952725174 +0000 UTC m=+130.347086892 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : secret "metrics-daemon-secret" not found
Apr 21 04:39:39.955950 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:39.955933 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8a94be7-29d2-46f0-af2a-5a46e5fe8810-original-pull-secret\") pod \"global-pull-secret-syncer-2wxxz\" (UID: \"a8a94be7-29d2-46f0-af2a-5a46e5fe8810\") " pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:39:40.042623 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.042598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:40.048111 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.045521 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 04:39:40.054971 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.054951 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 04:39:40.066006 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.065984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd2ft\" (UniqueName: \"kubernetes.io/projected/da67b91f-e17f-4c7a-a45a-dddc62350e0e-kube-api-access-fd2ft\") pod \"network-check-target-bctwd\" (UID: \"da67b91f-e17f-4c7a-a45a-dddc62350e0e\") " pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:40.130966 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.130946 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2wxxz"
Apr 21 04:39:40.140305 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.140286 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgxqw\""
Apr 21 04:39:40.148118 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.148098 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:40.250773 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.250742 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2wxxz"]
Apr 21 04:39:40.254613 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:39:40.254586 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a94be7_29d2_46f0_af2a_5a46e5fe8810.slice/crio-eec4840bdf032fc4bcd3f121a9ee86ed9f8747b153f3f89672acf570fe9238bd WatchSource:0}: Error finding container eec4840bdf032fc4bcd3f121a9ee86ed9f8747b153f3f89672acf570fe9238bd: Status 404 returned error can't find the container with id eec4840bdf032fc4bcd3f121a9ee86ed9f8747b153f3f89672acf570fe9238bd
Apr 21 04:39:40.266975 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.266950 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bctwd"]
Apr 21 04:39:40.270460 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:39:40.270437 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda67b91f_e17f_4c7a_a45a_dddc62350e0e.slice/crio-ec0853ece79539bacf71b4d79c2ce519bf754e6f83405696e42933f5000ee97b WatchSource:0}: Error finding container ec0853ece79539bacf71b4d79c2ce519bf754e6f83405696e42933f5000ee97b: Status 404 returned error can't find the container with id ec0853ece79539bacf71b4d79c2ce519bf754e6f83405696e42933f5000ee97b
Apr 21 04:39:40.447548 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.447463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bctwd" event={"ID":"da67b91f-e17f-4c7a-a45a-dddc62350e0e","Type":"ContainerStarted","Data":"ec0853ece79539bacf71b4d79c2ce519bf754e6f83405696e42933f5000ee97b"}
Apr 21 04:39:40.448413 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:40.448391 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2wxxz" event={"ID":"a8a94be7-29d2-46f0-af2a-5a46e5fe8810","Type":"ContainerStarted","Data":"eec4840bdf032fc4bcd3f121a9ee86ed9f8747b153f3f89672acf570fe9238bd"}
Apr 21 04:39:45.459728 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:45.459690 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bctwd" event={"ID":"da67b91f-e17f-4c7a-a45a-dddc62350e0e","Type":"ContainerStarted","Data":"a93b313203d900a22696f0d86c93d300c5312c70cbd7d920643d5354d79aa0c5"}
Apr 21 04:39:45.460208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:45.459793 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:39:45.461046 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:45.461023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2wxxz" event={"ID":"a8a94be7-29d2-46f0-af2a-5a46e5fe8810","Type":"ContainerStarted","Data":"90ce46230e94340ee529715affeae5a4eb86947afe1a4557ed3f37fec82dd701"}
Apr 21 04:39:45.475015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:45.474976 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bctwd" podStartSLOduration=67.234228505 podStartE2EDuration="1m11.474965526s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:39:40.272178112 +0000 UTC m=+66.666539830" lastFinishedPulling="2026-04-21 04:39:44.512915131 +0000 UTC m=+70.907276851" observedRunningTime="2026-04-21 04:39:45.47319646 +0000 UTC m=+71.867558212" watchObservedRunningTime="2026-04-21 04:39:45.474965526 +0000 UTC m=+71.869327265"
Apr 21 04:39:45.487130 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:39:45.487089 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2wxxz" podStartSLOduration=67.225985339 podStartE2EDuration="1m11.487079272s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:39:40.25643371 +0000 UTC m=+66.650795433" lastFinishedPulling="2026-04-21 04:39:44.517527647 +0000 UTC m=+70.911889366" observedRunningTime="2026-04-21 04:39:45.486804783 +0000 UTC m=+71.881166523" watchObservedRunningTime="2026-04-21 04:39:45.487079272 +0000 UTC m=+71.881441012"
Apr 21 04:40:11.371773 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:11.371735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:40:11.372137 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.371866 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:40:11.372137 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.371882 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5647cc45cd-54w7k: secret "image-registry-tls" not found
Apr 21 04:40:11.372137 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.371936 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls podName:edbbec20-1f0f-4ad3-a24c-46bfdbc47e17 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:15.371919302 +0000 UTC m=+161.766281020 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls") pod "image-registry-5647cc45cd-54w7k" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17") : secret "image-registry-tls" not found
Apr 21 04:40:11.472665 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:11.472578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:40:11.472665 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:11.472616 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:40:11.472838 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.472720 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:40:11.472838 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.472775 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert podName:b0ff8417-568b-49f9-adc4-be1ff4ba8ca5 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:15.472760064 +0000 UTC m=+161.867121781 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert") pod "ingress-canary-xwsgf" (UID: "b0ff8417-568b-49f9-adc4-be1ff4ba8ca5") : secret "canary-serving-cert" not found
Apr 21 04:40:11.472838 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.472773 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:40:11.472838 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:11.472800 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls podName:3d7b4054-d280-4074-b713-a7fe58a0ee82 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:15.472794337 +0000 UTC m=+161.867156054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls") pod "dns-default-m4stv" (UID: "3d7b4054-d280-4074-b713-a7fe58a0ee82") : secret "dns-default-metrics-tls" not found
Apr 21 04:40:16.466354 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:16.466320 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bctwd"
Apr 21 04:40:43.999761 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:43.999715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:40:44.000156 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:43.999856 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 04:40:44.000156 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:40:43.999920 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs podName:1500cffd-5994-4d2a-bd36-855f9cf3efe5 nodeName:}" failed. No retries permitted until 2026-04-21 04:42:45.999904203 +0000 UTC m=+252.394265921 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs") pod "network-metrics-daemon-c478k" (UID: "1500cffd-5994-4d2a-bd36-855f9cf3efe5") : secret "metrics-daemon-secret" not found
Apr 21 04:40:44.184172 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:44.184145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m6sh5_c70f167b-0eff-4017-9272-7a887e981112/dns-node-resolver/0.log"
Apr 21 04:40:45.183531 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:40:45.183489 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7cpf7_c6ff4930-586a-401d-8bf7-787218f408d0/node-ca/0.log"
Apr 21 04:41:10.512576 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:41:10.512534 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"
Apr 21 04:41:10.571954 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:41:10.571911 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m4stv" podUID="3d7b4054-d280-4074-b713-a7fe58a0ee82"
Apr 21 04:41:10.582035 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:41:10.582008 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xwsgf" podUID="b0ff8417-568b-49f9-adc4-be1ff4ba8ca5"
Apr 21 04:41:10.653700 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:10.653672 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:41:10.653847 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:10.653704 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m4stv"
Apr 21 04:41:10.653847 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:10.653672 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:41:12.110706 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.110677 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cqh6m"]
Apr 21 04:41:12.113713 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.113697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.115952 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.115930 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-x7jkx\""
Apr 21 04:41:12.116187 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.116169 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 04:41:12.116876 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.116857 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 04:41:12.116964 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.116913 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 04:41:12.117192 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.117177 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 04:41:12.125632 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.125609 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cqh6m"]
Apr 21 04:41:12.205349 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.205324 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-crio-socket\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.205479 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.205360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-data-volume\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.205479 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.205381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.205479 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.205399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.205479 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.205452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrnm\" (UniqueName: \"kubernetes.io/projected/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-kube-api-access-qlrnm\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.244469 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:41:12.244437 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-c478k" podUID="1500cffd-5994-4d2a-bd36-855f9cf3efe5"
Apr 21 04:41:12.306461 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-crio-socket\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.306587 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-data-volume\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.306587 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306487 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.306587 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306565 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-crio-socket\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.306705 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306613 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.306705 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306648 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrnm\" (UniqueName: \"kubernetes.io/projected/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-kube-api-access-qlrnm\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.306778 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.306732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-data-volume\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.307091 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.307076 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.308794 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.308772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.316023 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.316004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrnm\" (UniqueName: \"kubernetes.io/projected/e2f06537-61ad-4db9-9f8b-8b588c0e0f9e-kube-api-access-qlrnm\") pod \"insights-runtime-extractor-cqh6m\" (UID: \"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e\") " pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.423168 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.423100 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cqh6m"
Apr 21 04:41:12.539882 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.539850 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cqh6m"]
Apr 21 04:41:12.542972 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:41:12.542939 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f06537_61ad_4db9_9f8b_8b588c0e0f9e.slice/crio-36dad7826ad03457e7b9712f4aad5009ce6427e5928b25394b8d81884180eb6e WatchSource:0}: Error finding container 36dad7826ad03457e7b9712f4aad5009ce6427e5928b25394b8d81884180eb6e: Status 404 returned error can't find the container with id 36dad7826ad03457e7b9712f4aad5009ce6427e5928b25394b8d81884180eb6e
Apr 21 04:41:12.659197 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.659158 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqh6m" event={"ID":"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e","Type":"ContainerStarted","Data":"8579d1880479b9ad44271cc52744250e841c06e690f5ae571415e2898a69561d"}
Apr 21 04:41:12.659197 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:12.659194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqh6m" event={"ID":"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e","Type":"ContainerStarted","Data":"36dad7826ad03457e7b9712f4aad5009ce6427e5928b25394b8d81884180eb6e"}
Apr 21 04:41:13.668586 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:13.668531 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqh6m" event={"ID":"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e","Type":"ContainerStarted","Data":"05f793c56c7613491cab1be76ddd7c45a5deb5efcbbc090746db8ad51200464e"}
Apr 21 04:41:14.671753 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:14.671725 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cqh6m" event={"ID":"e2f06537-61ad-4db9-9f8b-8b588c0e0f9e","Type":"ContainerStarted","Data":"c18014343349b0607e9828ac87efc288592e5c9655a7ad7c096dec98417790b8"}
Apr 21 04:41:14.688508 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:14.688451 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cqh6m" podStartSLOduration=0.776919278 podStartE2EDuration="2.688435704s" podCreationTimestamp="2026-04-21 04:41:12 +0000 UTC" firstStartedPulling="2026-04-21 04:41:12.596673891 +0000 UTC m=+158.991035608" lastFinishedPulling="2026-04-21 04:41:14.508190302 +0000 UTC m=+160.902552034" observedRunningTime="2026-04-21 04:41:14.687566898 +0000 UTC m=+161.081928642" watchObservedRunningTime="2026-04-21 04:41:14.688435704 +0000 UTC m=+161.082797443"
Apr 21 04:41:15.430922 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.430886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:41:15.433309 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.433276 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"image-registry-5647cc45cd-54w7k\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:41:15.457706 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.457683 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nlhzm\""
Apr 21 04:41:15.464621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.464600 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:41:15.531596 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.531565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:41:15.531752 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.531606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:41:15.533827 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.533803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d7b4054-d280-4074-b713-a7fe58a0ee82-metrics-tls\") pod \"dns-default-m4stv\" (UID: \"3d7b4054-d280-4074-b713-a7fe58a0ee82\") " pod="openshift-dns/dns-default-m4stv"
Apr 21 04:41:15.534073 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.534055 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0ff8417-568b-49f9-adc4-be1ff4ba8ca5-cert\") pod \"ingress-canary-xwsgf\" (UID: \"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5\") " pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:41:15.584312 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.584280 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5647cc45cd-54w7k"]
Apr 21 04:41:15.592385 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:41:15.592358 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedbbec20_1f0f_4ad3_a24c_46bfdbc47e17.slice/crio-02a01c2884bda90ac45c3c6d84d4564c6eff232885aa199ff6132238589d91a8 WatchSource:0}: Error finding container 02a01c2884bda90ac45c3c6d84d4564c6eff232885aa199ff6132238589d91a8: Status 404 returned error can't find the container with id 02a01c2884bda90ac45c3c6d84d4564c6eff232885aa199ff6132238589d91a8
Apr 21 04:41:15.675216 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.675187 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" event={"ID":"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17","Type":"ContainerStarted","Data":"0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0"}
Apr 21 04:41:15.675581 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.675229 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" event={"ID":"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17","Type":"ContainerStarted","Data":"02a01c2884bda90ac45c3c6d84d4564c6eff232885aa199ff6132238589d91a8"}
Apr 21 04:41:15.675581 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.675333 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:41:15.676538 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.676516 2570 generic.go:358] "Generic (PLEG): container finished" podID="b64d2b1f-0a86-4237-801c-025099403da9" containerID="f5c0d11516465eba57d13b0f79e091e97cd61655c62b615abf4c2d80e8662d7f" exitCode=255
Apr 21 04:41:15.676637 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.676544 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" event={"ID":"b64d2b1f-0a86-4237-801c-025099403da9","Type":"ContainerDied","Data":"f5c0d11516465eba57d13b0f79e091e97cd61655c62b615abf4c2d80e8662d7f"}
Apr 21 04:41:15.676847 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.676831 2570 scope.go:117] "RemoveContainer" containerID="f5c0d11516465eba57d13b0f79e091e97cd61655c62b615abf4c2d80e8662d7f"
Apr 21 04:41:15.677844 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.677824 2570 generic.go:358] "Generic (PLEG): container finished" podID="ade469fe-d322-489f-8cdf-95e181f955f6" containerID="982d3bc35bc4edc11a0409ebec0c3c9a762d488f3d16504c9649fbab0664681d" exitCode=1
Apr 21 04:41:15.677952 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.677845 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" event={"ID":"ade469fe-d322-489f-8cdf-95e181f955f6","Type":"ContainerDied","Data":"982d3bc35bc4edc11a0409ebec0c3c9a762d488f3d16504c9649fbab0664681d"}
Apr 21 04:41:15.678331 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.678316 2570 scope.go:117] "RemoveContainer" containerID="982d3bc35bc4edc11a0409ebec0c3c9a762d488f3d16504c9649fbab0664681d"
Apr 21 04:41:15.699275 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.699232 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" podStartSLOduration=140.699220209 podStartE2EDuration="2m20.699220209s" podCreationTimestamp="2026-04-21 04:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:41:15.698231972 +0000 UTC m=+162.092593717" watchObservedRunningTime="2026-04-21 04:41:15.699220209 +0000 UTC m=+162.093581949"
Apr 21 04:41:15.756710 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.756679 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ljx4n\""
Apr 21 04:41:15.756883 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.756734 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f9n52\""
Apr 21 04:41:15.765588 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.765563 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m4stv"
Apr 21 04:41:15.765726 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.765660 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xwsgf"
Apr 21 04:41:15.906381 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.906350 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m4stv"]
Apr 21 04:41:15.910347 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:41:15.910308 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d7b4054_d280_4074_b713_a7fe58a0ee82.slice/crio-7c92a079658b097c3d4638626388749e0fc67d8dbc1bc09ea3db1dad863b9660 WatchSource:0}: Error finding container 7c92a079658b097c3d4638626388749e0fc67d8dbc1bc09ea3db1dad863b9660: Status 404 returned error can't find the container with id 7c92a079658b097c3d4638626388749e0fc67d8dbc1bc09ea3db1dad863b9660
Apr 21 04:41:15.918920 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:15.918851 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xwsgf"]
Apr 21 04:41:15.922439 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:41:15.922417 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0ff8417_568b_49f9_adc4_be1ff4ba8ca5.slice/crio-39bce5f51508b4d7f2c02f2720f7a5b8ba7699c4896e103bb1732d34729be668 WatchSource:0}: Error finding container 39bce5f51508b4d7f2c02f2720f7a5b8ba7699c4896e103bb1732d34729be668: Status 404 returned error can't find the container with id 39bce5f51508b4d7f2c02f2720f7a5b8ba7699c4896e103bb1732d34729be668
Apr 21 04:41:16.397509 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.397456 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:41:16.683570 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.683466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5" event={"ID":"ade469fe-d322-489f-8cdf-95e181f955f6","Type":"ContainerStarted","Data":"d318ca383b1b2794ceae45714045e404c77ea191c79ea47b2b4a185ef9b5c358"}
Apr 21 04:41:16.684086 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.684067 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:41:16.684546 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.684522 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9f9459848-qc2s5"
Apr 21 04:41:16.684980 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.684954 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m4stv" event={"ID":"3d7b4054-d280-4074-b713-a7fe58a0ee82","Type":"ContainerStarted","Data":"7c92a079658b097c3d4638626388749e0fc67d8dbc1bc09ea3db1dad863b9660"}
Apr 21 04:41:16.686193 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.686169 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xwsgf" event={"ID":"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5","Type":"ContainerStarted","Data":"39bce5f51508b4d7f2c02f2720f7a5b8ba7699c4896e103bb1732d34729be668"}
Apr 21 04:41:16.688701 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:16.688659 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f9b79b6d9-276jh" event={"ID":"b64d2b1f-0a86-4237-801c-025099403da9","Type":"ContainerStarted","Data":"7cb2698859a49f3c63322f55db5611196fcb11136953a52899eed7666edb0135"}
Apr 21 04:41:18.696862 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:18.696829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m4stv" event={"ID":"3d7b4054-d280-4074-b713-a7fe58a0ee82","Type":"ContainerStarted","Data":"9f3c864e807166ac2acb2d22e95d0112295839cdb732695d5c3f0a1f2b010a9e"}
Apr 21 04:41:18.696862 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:18.696863 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m4stv" event={"ID":"3d7b4054-d280-4074-b713-a7fe58a0ee82","Type":"ContainerStarted","Data":"7e102c01f7ad4e02e385df81cb5ebe5ac35435f22b0248d671d360b4a006935f"}
Apr 21 04:41:18.697300 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:18.697016 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m4stv"
Apr 21 04:41:18.698138 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:18.698118 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xwsgf" event={"ID":"b0ff8417-568b-49f9-adc4-be1ff4ba8ca5","Type":"ContainerStarted","Data":"55c6a9338e144a4836c7907d477cea8e3455b5cac03cdc9e844287344a0c5c3c"}
Apr 21 04:41:18.713742 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:18.713702 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m4stv"
podStartSLOduration=129.974504311 podStartE2EDuration="2m11.713690733s" podCreationTimestamp="2026-04-21 04:39:07 +0000 UTC" firstStartedPulling="2026-04-21 04:41:15.91280117 +0000 UTC m=+162.307162893" lastFinishedPulling="2026-04-21 04:41:17.651987586 +0000 UTC m=+164.046349315" observedRunningTime="2026-04-21 04:41:18.712289243 +0000 UTC m=+165.106650994" watchObservedRunningTime="2026-04-21 04:41:18.713690733 +0000 UTC m=+165.108052472" Apr 21 04:41:18.726249 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:18.726210 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xwsgf" podStartSLOduration=129.995451897 podStartE2EDuration="2m11.726199157s" podCreationTimestamp="2026-04-21 04:39:07 +0000 UTC" firstStartedPulling="2026-04-21 04:41:15.924811841 +0000 UTC m=+162.319173562" lastFinishedPulling="2026-04-21 04:41:17.655559105 +0000 UTC m=+164.049920822" observedRunningTime="2026-04-21 04:41:18.72564006 +0000 UTC m=+165.120001803" watchObservedRunningTime="2026-04-21 04:41:18.726199157 +0000 UTC m=+165.120560915" Apr 21 04:41:20.693065 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.693032 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hxlxc"] Apr 21 04:41:20.696194 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.696177 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.698625 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.698604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p4hrl\"" Apr 21 04:41:20.698770 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.698604 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 04:41:20.698828 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.698797 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 04:41:20.698922 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.698884 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:41:20.699510 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.699473 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 04:41:20.699745 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.699719 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 04:41:20.699839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.699751 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 04:41:20.770206 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770180 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-wtmp\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " 
pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770326 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-accelerators-collector-config\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770326 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770243 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a1ca69-0786-4476-865b-922206b6c523-metrics-client-ca\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770326 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770471 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770363 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-root\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770471 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770385 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-textfile\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770471 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-sys\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770471 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770420 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbfg\" (UniqueName: \"kubernetes.io/projected/24a1ca69-0786-4476-865b-922206b6c523-kube-api-access-bkbfg\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.770471 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.770441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-tls\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.871944 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.871908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872095 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:41:20.871985 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-root\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872095 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-textfile\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872095 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872056 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-sys\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872095 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872080 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbfg\" (UniqueName: \"kubernetes.io/projected/24a1ca69-0786-4476-865b-922206b6c523-kube-api-access-bkbfg\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872292 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-tls\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872292 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:41:20.872166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-wtmp\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872292 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-accelerators-collector-config\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.872292 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872264 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a1ca69-0786-4476-865b-922206b6c523-metrics-client-ca\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.873098 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.872897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24a1ca69-0786-4476-865b-922206b6c523-metrics-client-ca\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.874661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.873587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-root\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " 
pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.874661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.873876 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-textfile\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.874661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.874020 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-wtmp\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.874661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.874472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-accelerators-collector-config\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.874661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.874583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24a1ca69-0786-4476-865b-922206b6c523-sys\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.876228 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.875684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.876386 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.876369 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/24a1ca69-0786-4476-865b-922206b6c523-node-exporter-tls\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:20.881752 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:20.881733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbfg\" (UniqueName: \"kubernetes.io/projected/24a1ca69-0786-4476-865b-922206b6c523-kube-api-access-bkbfg\") pod \"node-exporter-hxlxc\" (UID: \"24a1ca69-0786-4476-865b-922206b6c523\") " pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:21.005981 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:21.005924 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hxlxc" Apr 21 04:41:21.013398 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:41:21.013376 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a1ca69_0786_4476_865b_922206b6c523.slice/crio-0d33a623397ba01b4093c5ab939a3b100cc7911c242d02957a92a14a20055358 WatchSource:0}: Error finding container 0d33a623397ba01b4093c5ab939a3b100cc7911c242d02957a92a14a20055358: Status 404 returned error can't find the container with id 0d33a623397ba01b4093c5ab939a3b100cc7911c242d02957a92a14a20055358 Apr 21 04:41:21.706923 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:21.706899 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hxlxc" event={"ID":"24a1ca69-0786-4476-865b-922206b6c523","Type":"ContainerStarted","Data":"0d33a623397ba01b4093c5ab939a3b100cc7911c242d02957a92a14a20055358"} Apr 21 04:41:22.711339 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:22.711305 2570 generic.go:358] "Generic (PLEG): container finished" podID="24a1ca69-0786-4476-865b-922206b6c523" containerID="2c41a703c8dc82f39facfb942297814ed526952d3fb652d8694b4ae3dadecc75" exitCode=0 Apr 21 04:41:22.711709 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:22.711389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hxlxc" event={"ID":"24a1ca69-0786-4476-865b-922206b6c523","Type":"ContainerDied","Data":"2c41a703c8dc82f39facfb942297814ed526952d3fb652d8694b4ae3dadecc75"} Apr 21 04:41:23.715864 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:23.715779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hxlxc" event={"ID":"24a1ca69-0786-4476-865b-922206b6c523","Type":"ContainerStarted","Data":"f4a3c3b1fa09b03e650d39708245409135f4ed84c8a8d50d662488392e9280db"} Apr 21 04:41:23.715864 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:23.715830 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hxlxc" event={"ID":"24a1ca69-0786-4476-865b-922206b6c523","Type":"ContainerStarted","Data":"61348afbb00964a72aa6164df1e1d0cdf1b97fdbf919e1ac4cd81f624132705d"} Apr 21 04:41:23.733991 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:23.733941 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hxlxc" podStartSLOduration=3.063777302 podStartE2EDuration="3.733927496s" podCreationTimestamp="2026-04-21 04:41:20 +0000 UTC" firstStartedPulling="2026-04-21 04:41:21.014990331 +0000 UTC m=+167.409352049" lastFinishedPulling="2026-04-21 04:41:21.685140525 +0000 UTC m=+168.079502243" observedRunningTime="2026-04-21 04:41:23.732679821 +0000 UTC m=+170.127041561" watchObservedRunningTime="2026-04-21 04:41:23.733927496 +0000 UTC m=+170.128289239" Apr 21 04:41:27.215156 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:27.215113 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k" Apr 21 04:41:28.703156 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:28.703127 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m4stv" Apr 21 04:41:34.394512 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:34.394468 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5647cc45cd-54w7k"] Apr 21 04:41:34.398443 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:34.398411 2570 patch_prober.go:28] interesting pod/image-registry-5647cc45cd-54w7k container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 04:41:34.398585 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:34.398475 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:41:44.398536 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:44.398483 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:41:59.412889 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.412833 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" containerName="registry" containerID="cri-o://0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0" gracePeriod=30 Apr 21 04:41:59.647997 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.647975 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" Apr 21 04:41:59.651953 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.651932 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-ca-trust-extracted\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " Apr 21 04:41:59.652063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.651969 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-image-registry-private-configuration\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " Apr 21 04:41:59.652063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.651995 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm7p9\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-kube-api-access-tm7p9\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " Apr 21 04:41:59.652063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.652015 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " Apr 21 04:41:59.652063 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.652034 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-trusted-ca\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " 
Apr 21 04:41:59.652266 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.652083 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-bound-sa-token\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " Apr 21 04:41:59.652266 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.652106 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-certificates\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") " Apr 21 04:41:59.652665 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.652638 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:41:59.652665 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.652647 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:41:59.654489 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.654419 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:41:59.654666 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.654641 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-kube-api-access-tm7p9" (OuterVolumeSpecName: "kube-api-access-tm7p9") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "kube-api-access-tm7p9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:41:59.654731 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.654658 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:41:59.654786 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.654758 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:41:59.661180 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.661158 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:41:59.753330 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753274 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-installation-pull-secrets\") pod \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\" (UID: \"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17\") "
Apr 21 04:41:59.753444 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753428 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-bound-sa-token\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.753483 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753452 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-certificates\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.753483 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753467 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-ca-trust-extracted\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.753611 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753482 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-image-registry-private-configuration\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.753611 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753525 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tm7p9\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-kube-api-access-tm7p9\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.753611 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753540 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-registry-tls\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.753611 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.753554 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-trusted-ca\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:41:59.755085 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.755066 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" (UID: "edbbec20-1f0f-4ad3-a24c-46bfdbc47e17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:41:59.806471 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.806449 2570 generic.go:358] "Generic (PLEG): container finished" podID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" containerID="0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0" exitCode=0
Apr 21 04:41:59.806564 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.806512 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" event={"ID":"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17","Type":"ContainerDied","Data":"0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0"}
Apr 21 04:41:59.806564 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.806536 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k" event={"ID":"edbbec20-1f0f-4ad3-a24c-46bfdbc47e17","Type":"ContainerDied","Data":"02a01c2884bda90ac45c3c6d84d4564c6eff232885aa199ff6132238589d91a8"}
Apr 21 04:41:59.806564 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.806561 2570 scope.go:117] "RemoveContainer" containerID="0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0"
Apr 21 04:41:59.806660 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.806516 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5647cc45cd-54w7k"
Apr 21 04:41:59.814020 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.813998 2570 scope.go:117] "RemoveContainer" containerID="0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0"
Apr 21 04:41:59.814366 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:41:59.814279 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0\": container with ID starting with 0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0 not found: ID does not exist" containerID="0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0"
Apr 21 04:41:59.814366 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.814306 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0"} err="failed to get container status \"0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0\": rpc error: code = NotFound desc = could not find container \"0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0\": container with ID starting with 0def0638e668a2ec482bc977d49b58d7ca5d396d724d74328df9d6c1d39cd4a0 not found: ID does not exist"
Apr 21 04:41:59.825829 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.825807 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5647cc45cd-54w7k"]
Apr 21 04:41:59.829534 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.829512 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5647cc45cd-54w7k"]
Apr 21 04:41:59.853807 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:41:59.853788 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17-installation-pull-secrets\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\""
Apr 21 04:42:00.219142 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:00.219110 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" path="/var/lib/kubelet/pods/edbbec20-1f0f-4ad3-a24c-46bfdbc47e17/volumes"
Apr 21 04:42:07.847974 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:07.847938 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" podUID="c7b58ecd-52e3-45cd-9fc2-fc066d5faadc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 04:42:17.848135 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:17.848092 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" podUID="c7b58ecd-52e3-45cd-9fc2-fc066d5faadc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 04:42:27.848080 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:27.848038 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" podUID="c7b58ecd-52e3-45cd-9fc2-fc066d5faadc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 04:42:27.848457 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:27.848116 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc"
Apr 21 04:42:27.848585 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:27.848567 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"dde4298fe7e1042e88652980ebc9568f31d0436e7e8e75f5600fbe11ccbafb17"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 21 04:42:27.848629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:27.848605 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" podUID="c7b58ecd-52e3-45cd-9fc2-fc066d5faadc" containerName="service-proxy" containerID="cri-o://dde4298fe7e1042e88652980ebc9568f31d0436e7e8e75f5600fbe11ccbafb17" gracePeriod=30
Apr 21 04:42:28.881704 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:28.881673 2570 generic.go:358] "Generic (PLEG): container finished" podID="c7b58ecd-52e3-45cd-9fc2-fc066d5faadc" containerID="dde4298fe7e1042e88652980ebc9568f31d0436e7e8e75f5600fbe11ccbafb17" exitCode=2
Apr 21 04:42:28.882070 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:28.881731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" event={"ID":"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc","Type":"ContainerDied","Data":"dde4298fe7e1042e88652980ebc9568f31d0436e7e8e75f5600fbe11ccbafb17"}
Apr 21 04:42:28.882070 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:28.881767 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bfc7b77f5-7vxxc" event={"ID":"c7b58ecd-52e3-45cd-9fc2-fc066d5faadc","Type":"ContainerStarted","Data":"38e8b085a14e92ba73f673e0c90522788048b48e98646d073f7d8eb74e06720b"}
Apr 21 04:42:46.074164 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:46.074128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:42:46.076444 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:46.076410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1500cffd-5994-4d2a-bd36-855f9cf3efe5-metrics-certs\") pod \"network-metrics-daemon-c478k\" (UID: \"1500cffd-5994-4d2a-bd36-855f9cf3efe5\") " pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:42:46.118096 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:46.118062 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-979g8\""
Apr 21 04:42:46.126836 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:46.126812 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c478k"
Apr 21 04:42:46.241372 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:46.241343 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c478k"]
Apr 21 04:42:46.247118 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:42:46.247074 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1500cffd_5994_4d2a_bd36_855f9cf3efe5.slice/crio-1dc34504eee7b37f85a6bea278bae5eab127a9d8ef0fc88562cafa6e6640c2f7 WatchSource:0}: Error finding container 1dc34504eee7b37f85a6bea278bae5eab127a9d8ef0fc88562cafa6e6640c2f7: Status 404 returned error can't find the container with id 1dc34504eee7b37f85a6bea278bae5eab127a9d8ef0fc88562cafa6e6640c2f7
Apr 21 04:42:46.925237 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:46.925195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c478k" event={"ID":"1500cffd-5994-4d2a-bd36-855f9cf3efe5","Type":"ContainerStarted","Data":"1dc34504eee7b37f85a6bea278bae5eab127a9d8ef0fc88562cafa6e6640c2f7"}
Apr 21 04:42:47.931738 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:47.931698 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c478k" event={"ID":"1500cffd-5994-4d2a-bd36-855f9cf3efe5","Type":"ContainerStarted","Data":"e95303b66795aa8bab533d7b02cc227c211682fd734d0de76d85aa04c2e04745"}
Apr 21 04:42:47.931738 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:47.931738 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c478k" event={"ID":"1500cffd-5994-4d2a-bd36-855f9cf3efe5","Type":"ContainerStarted","Data":"649fdf766389ea0413a7fe5dc61fcac3984591a0f58967b08674b869bf1c33be"}
Apr 21 04:42:47.946212 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:42:47.946161 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c478k" podStartSLOduration=253.063812716 podStartE2EDuration="4m13.946149795s" podCreationTimestamp="2026-04-21 04:38:34 +0000 UTC" firstStartedPulling="2026-04-21 04:42:46.248832166 +0000 UTC m=+252.643193884" lastFinishedPulling="2026-04-21 04:42:47.131169245 +0000 UTC m=+253.525530963" observedRunningTime="2026-04-21 04:42:47.945340562 +0000 UTC m=+254.339702302" watchObservedRunningTime="2026-04-21 04:42:47.946149795 +0000 UTC m=+254.340511534"
Apr 21 04:43:34.125353 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:43:34.125324 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 04:44:25.562820 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.562745 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-f65bw"]
Apr 21 04:44:25.563191 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.562977 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" containerName="registry"
Apr 21 04:44:25.563191 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.562988 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" containerName="registry"
Apr 21 04:44:25.563191 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.563034 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="edbbec20-1f0f-4ad3-a24c-46bfdbc47e17" containerName="registry"
Apr 21 04:44:25.565696 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.565680 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.568128 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.568098 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-65qwf\""
Apr 21 04:44:25.568253 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.568170 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 04:44:25.568996 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.568983 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 04:44:25.575838 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.575818 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-f65bw"]
Apr 21 04:44:25.647598 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.647571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-f65bw\" (UID: \"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e\") " pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.647704 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.647643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcvqw\" (UniqueName: \"kubernetes.io/projected/d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e-kube-api-access-bcvqw\") pod \"cert-manager-webhook-587ccfb98-f65bw\" (UID: \"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e\") " pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.748040 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.748018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcvqw\" (UniqueName: \"kubernetes.io/projected/d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e-kube-api-access-bcvqw\") pod \"cert-manager-webhook-587ccfb98-f65bw\" (UID: \"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e\") " pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.748137 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.748062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-f65bw\" (UID: \"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e\") " pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.759138 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.759106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-f65bw\" (UID: \"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e\") " pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.759336 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.759318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcvqw\" (UniqueName: \"kubernetes.io/projected/d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e-kube-api-access-bcvqw\") pod \"cert-manager-webhook-587ccfb98-f65bw\" (UID: \"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e\") " pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.874465 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.874413 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:25.985679 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.985555 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-f65bw"]
Apr 21 04:44:25.989590 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:44:25.989562 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd21025eb_0ba7_4ff0_9728_3e1e0bdfac9e.slice/crio-b184f67f3e7f3405234ccd1568dda105241e4b618e652fbe990f94e4974edc1d WatchSource:0}: Error finding container b184f67f3e7f3405234ccd1568dda105241e4b618e652fbe990f94e4974edc1d: Status 404 returned error can't find the container with id b184f67f3e7f3405234ccd1568dda105241e4b618e652fbe990f94e4974edc1d
Apr 21 04:44:25.991555 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:25.991536 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:44:26.179814 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:26.179731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw" event={"ID":"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e","Type":"ContainerStarted","Data":"b184f67f3e7f3405234ccd1568dda105241e4b618e652fbe990f94e4974edc1d"}
Apr 21 04:44:29.190523 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:29.190478 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw" event={"ID":"d21025eb-0ba7-4ff0-9728-3e1e0bdfac9e","Type":"ContainerStarted","Data":"7b34d8a240caab6f8cd030eced3662f277e600204dd2a587891352847ebe6311"}
Apr 21 04:44:29.190879 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:29.190540 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:29.213509 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:29.213462 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw" podStartSLOduration=1.570973409 podStartE2EDuration="4.213447481s" podCreationTimestamp="2026-04-21 04:44:25 +0000 UTC" firstStartedPulling="2026-04-21 04:44:25.991700079 +0000 UTC m=+352.386061797" lastFinishedPulling="2026-04-21 04:44:28.634174152 +0000 UTC m=+355.028535869" observedRunningTime="2026-04-21 04:44:29.211969825 +0000 UTC m=+355.606331565" watchObservedRunningTime="2026-04-21 04:44:29.213447481 +0000 UTC m=+355.607809221"
Apr 21 04:44:35.194732 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:35.194706 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-f65bw"
Apr 21 04:44:51.085408 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.085376 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"]
Apr 21 04:44:51.093119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.093094 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.095411 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.095384 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 04:44:51.095571 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.095420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9fpwh\""
Apr 21 04:44:51.095571 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.095546 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 04:44:51.095678 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.095669 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 04:44:51.095841 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.095829 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 04:44:51.105333 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.105314 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"]
Apr 21 04:44:51.231463 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.231429 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff22f569-478b-4a9f-b91c-f437a2794ab9-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.231644 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.231469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff22f569-478b-4a9f-b91c-f437a2794ab9-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.231644 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.231552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjmdc\" (UniqueName: \"kubernetes.io/projected/ff22f569-478b-4a9f-b91c-f437a2794ab9-kube-api-access-sjmdc\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.332722 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.332689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff22f569-478b-4a9f-b91c-f437a2794ab9-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.332874 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.332746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff22f569-478b-4a9f-b91c-f437a2794ab9-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.332874 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.332814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjmdc\" (UniqueName: \"kubernetes.io/projected/ff22f569-478b-4a9f-b91c-f437a2794ab9-kube-api-access-sjmdc\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.334896 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.334868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff22f569-478b-4a9f-b91c-f437a2794ab9-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.335044 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.334956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff22f569-478b-4a9f-b91c-f437a2794ab9-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.341818 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.341755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjmdc\" (UniqueName: \"kubernetes.io/projected/ff22f569-478b-4a9f-b91c-f437a2794ab9-kube-api-access-sjmdc\") pod \"opendatahub-operator-controller-manager-55ddb68486-ws65x\" (UID: \"ff22f569-478b-4a9f-b91c-f437a2794ab9\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.402669 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.402644 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:51.518823 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:51.518795 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"]
Apr 21 04:44:51.522780 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:44:51.522751 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff22f569_478b_4a9f_b91c_f437a2794ab9.slice/crio-f5d409ba3a070827d9827ac24a4d2075f2d256e0b17f2956c5f11d39daf24cf2 WatchSource:0}: Error finding container f5d409ba3a070827d9827ac24a4d2075f2d256e0b17f2956c5f11d39daf24cf2: Status 404 returned error can't find the container with id f5d409ba3a070827d9827ac24a4d2075f2d256e0b17f2956c5f11d39daf24cf2
Apr 21 04:44:52.246736 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:52.246697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x" event={"ID":"ff22f569-478b-4a9f-b91c-f437a2794ab9","Type":"ContainerStarted","Data":"f5d409ba3a070827d9827ac24a4d2075f2d256e0b17f2956c5f11d39daf24cf2"}
Apr 21 04:44:54.253249 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:54.253219 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x" event={"ID":"ff22f569-478b-4a9f-b91c-f437a2794ab9","Type":"ContainerStarted","Data":"9bc2238914c015de58e207be20cd23dd88af832eb30a28eee395fa5c011df575"}
Apr 21 04:44:54.253603 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:54.253339 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x"
Apr 21 04:44:54.271347 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:44:54.271296 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x" podStartSLOduration=0.643949203 podStartE2EDuration="3.271281968s" podCreationTimestamp="2026-04-21 04:44:51 +0000 UTC" firstStartedPulling="2026-04-21 04:44:51.524433874 +0000 UTC m=+377.918795593" lastFinishedPulling="2026-04-21 04:44:54.151766636 +0000 UTC m=+380.546128358" observedRunningTime="2026-04-21 04:44:54.270857068 +0000 UTC m=+380.665218809" watchObservedRunningTime="2026-04-21 04:44:54.271281968 +0000 UTC m=+380.665643710"
Apr 21 04:45:00.490655 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.490621 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"]
Apr 21 04:45:00.493886 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.493868 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.496619 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.496598 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 04:45:00.497643 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.497613 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 04:45:00.497753 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.497618 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 04:45:00.497753 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.497619 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 21 04:45:00.497753 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.497736 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:45:00.498653 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.498638 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qnzdb\""
Apr 21 04:45:00.516290 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.516265 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"]
Apr 21 04:45:00.597661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.597631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkk9h\" (UniqueName: \"kubernetes.io/projected/2b951746-6366-4fed-90d8-dc82961114b2-kube-api-access-rkk9h\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.597661 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.597672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b951746-6366-4fed-90d8-dc82961114b2-cert\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.597864 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.597735 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b951746-6366-4fed-90d8-dc82961114b2-metrics-cert\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.597864 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.597784 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2b951746-6366-4fed-90d8-dc82961114b2-manager-config\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.698539 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.698455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b951746-6366-4fed-90d8-dc82961114b2-metrics-cert\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.698698 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.698619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2b951746-6366-4fed-90d8-dc82961114b2-manager-config\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.698698 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.698664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkk9h\" (UniqueName: \"kubernetes.io/projected/2b951746-6366-4fed-90d8-dc82961114b2-kube-api-access-rkk9h\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.698698 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.698695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b951746-6366-4fed-90d8-dc82961114b2-cert\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.699349 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.699322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2b951746-6366-4fed-90d8-dc82961114b2-manager-config\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.700846 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.700827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b951746-6366-4fed-90d8-dc82961114b2-metrics-cert\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.700959 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.700934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b951746-6366-4fed-90d8-dc82961114b2-cert\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.707043 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.707019 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkk9h\" (UniqueName: \"kubernetes.io/projected/2b951746-6366-4fed-90d8-dc82961114b2-kube-api-access-rkk9h\") pod \"lws-controller-manager-bb95b586d-j9bn2\" (UID: \"2b951746-6366-4fed-90d8-dc82961114b2\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.802990 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.802961 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:00.923954 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:00.923925 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"]
Apr 21 04:45:00.927201 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:45:00.927179 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b951746_6366_4fed_90d8_dc82961114b2.slice/crio-e30a11eb6ba65c3313db80acd6e2279731aa7104cb3a3ac5cfd7e5b4453ba069 WatchSource:0}: Error finding container e30a11eb6ba65c3313db80acd6e2279731aa7104cb3a3ac5cfd7e5b4453ba069: Status 404 returned error can't find the container with id e30a11eb6ba65c3313db80acd6e2279731aa7104cb3a3ac5cfd7e5b4453ba069
Apr 21 04:45:01.270414 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:01.270337 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2" event={"ID":"2b951746-6366-4fed-90d8-dc82961114b2","Type":"ContainerStarted","Data":"e30a11eb6ba65c3313db80acd6e2279731aa7104cb3a3ac5cfd7e5b4453ba069"}
Apr 21 04:45:03.277162 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:03.277133 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2" event={"ID":"2b951746-6366-4fed-90d8-dc82961114b2","Type":"ContainerStarted","Data":"48dd49b7fea6f109d4d82e732fa2423f8f3db3b98fc254ac0c8274520379eb50"}
Apr 21 04:45:03.277540 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:03.277188 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2"
Apr 21 04:45:03.296620 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:03.296574 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2" podStartSLOduration=1.014724904 podStartE2EDuration="3.296561604s" podCreationTimestamp="2026-04-21 04:45:00 +0000 UTC" firstStartedPulling="2026-04-21 04:45:00.928888108 +0000 UTC m=+387.323249827" lastFinishedPulling="2026-04-21 04:45:03.210724803 +0000 UTC m=+389.605086527" observedRunningTime="2026-04-21 04:45:03.296250112 +0000 UTC m=+389.690611853" watchObservedRunningTime="2026-04-21 04:45:03.296561604 +0000 UTC m=+389.690923344" Apr 21 04:45:05.257947 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:05.257917 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-ws65x" Apr 21 04:45:14.283141 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:14.283113 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-j9bn2" Apr 21 04:45:28.989029 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:28.988992 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv"] Apr 21 04:45:29.064106 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.064073 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv"] Apr 21 04:45:29.064257 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.064127 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.066839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.066823 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 04:45:29.066957 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.066856 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 04:45:29.066957 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.066858 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 04:45:29.067044 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.066823 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-l7ztz\"" Apr 21 04:45:29.098425 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098540 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 
04:45:29.098540 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098460 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098540 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098534 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aaefc3d7-8065-4621-a896-47536e3037da-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098657 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098591 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv62w\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-kube-api-access-sv62w\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098657 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098657 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098758 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098660 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.098758 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.098682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aaefc3d7-8065-4621-a896-47536e3037da-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.113055 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.113032 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz"] Apr 21 04:45:29.137898 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.137877 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz"] Apr 21 04:45:29.138020 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.137995 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.199849 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.199822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.199988 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.199859 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.199988 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.199882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.199988 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.199901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.199988 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.199949 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.199988 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.199974 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/837f93dc-e520-4850-a838-f868fc265b37-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200241 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aaefc3d7-8065-4621-a896-47536e3037da-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200241 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200072 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/837f93dc-e520-4850-a838-f868fc265b37-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200241 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv62w\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-kube-api-access-sv62w\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200241 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95gwp\" (UniqueName: \"kubernetes.io/projected/837f93dc-e520-4850-a838-f868fc265b37-kube-api-access-95gwp\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200241 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200241 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200254 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200317 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200355 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/837f93dc-e520-4850-a838-f868fc265b37-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aaefc3d7-8065-4621-a896-47536e3037da-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.200621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200616 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.201073 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200656 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aaefc3d7-8065-4621-a896-47536e3037da-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.201073 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.200656 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.202169 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.202150 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-envoy\") pod 
\"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.203039 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.202728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aaefc3d7-8065-4621-a896-47536e3037da-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.223509 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.223480 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.227409 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.227388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv62w\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-kube-api-access-sv62w\") pod \"data-science-gateway-data-science-gateway-class-55fc86bb7724rjv\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.300997 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.300969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: 
\"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/837f93dc-e520-4850-a838-f868fc265b37-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/837f93dc-e520-4850-a838-f868fc265b37-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95gwp\" (UniqueName: \"kubernetes.io/projected/837f93dc-e520-4850-a838-f868fc265b37-kube-api-access-95gwp\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301108 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/837f93dc-e520-4850-a838-f868fc265b37-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301443 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301443 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301583 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301436 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301782 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301847 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301779 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.301885 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/837f93dc-e520-4850-a838-f868fc265b37-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 
21 04:45:29.301918 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.301855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.303101 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.303079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/837f93dc-e520-4850-a838-f868fc265b37-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.303406 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.303387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/837f93dc-e520-4850-a838-f868fc265b37-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.310293 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.310274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/837f93dc-e520-4850-a838-f868fc265b37-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.311602 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.311582 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95gwp\" (UniqueName: \"kubernetes.io/projected/837f93dc-e520-4850-a838-f868fc265b37-kube-api-access-95gwp\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f7gchz\" (UID: \"837f93dc-e520-4850-a838-f868fc265b37\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.374276 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.374249 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:29.450365 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.450268 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:29.491674 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.491635 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv"] Apr 21 04:45:29.494647 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:45:29.494617 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaefc3d7_8065_4621_a896_47536e3037da.slice/crio-bfcc6d8a2a7f433be053c90517a44bdb408d945a4c71c598f0f25c918bc709f8 WatchSource:0}: Error finding container bfcc6d8a2a7f433be053c90517a44bdb408d945a4c71c598f0f25c918bc709f8: Status 404 returned error can't find the container with id bfcc6d8a2a7f433be053c90517a44bdb408d945a4c71c598f0f25c918bc709f8 Apr 21 04:45:29.573467 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:29.573413 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz"] Apr 21 04:45:29.575415 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:45:29.575389 2570 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod837f93dc_e520_4850_a838_f868fc265b37.slice/crio-f658b05c5318e4bfc9e81c70ced9debd3b138656918a7fd1ab47254287c24fd7 WatchSource:0}: Error finding container f658b05c5318e4bfc9e81c70ced9debd3b138656918a7fd1ab47254287c24fd7: Status 404 returned error can't find the container with id f658b05c5318e4bfc9e81c70ced9debd3b138656918a7fd1ab47254287c24fd7 Apr 21 04:45:30.348480 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:30.348438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" event={"ID":"837f93dc-e520-4850-a838-f868fc265b37","Type":"ContainerStarted","Data":"f658b05c5318e4bfc9e81c70ced9debd3b138656918a7fd1ab47254287c24fd7"} Apr 21 04:45:30.350849 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:30.350815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" event={"ID":"aaefc3d7-8065-4621-a896-47536e3037da","Type":"ContainerStarted","Data":"bfcc6d8a2a7f433be053c90517a44bdb408d945a4c71c598f0f25c918bc709f8"} Apr 21 04:45:31.986297 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:31.986253 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:45:31.986718 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:31.986340 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:45:31.986718 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:31.986384 2570 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:45:31.992384 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:31.992350 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:45:31.992488 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:31.992423 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:45:31.992488 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:31.992463 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:45:32.359162 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:32.359128 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" event={"ID":"837f93dc-e520-4850-a838-f868fc265b37","Type":"ContainerStarted","Data":"fea25699cfcc67ab7e35826b86fe9a9f27373a76d59fb322dd13c6365bc2dc1a"} Apr 21 04:45:32.360479 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:32.360456 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" event={"ID":"aaefc3d7-8065-4621-a896-47536e3037da","Type":"ContainerStarted","Data":"13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365"} Apr 21 04:45:32.375028 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:32.375007 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:32.377715 ip-10-0-140-11 kubenswrapper[2570]: I0421 
04:45:32.377673 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" podStartSLOduration=0.962855099 podStartE2EDuration="3.377661186s" podCreationTimestamp="2026-04-21 04:45:29 +0000 UTC" firstStartedPulling="2026-04-21 04:45:29.577312953 +0000 UTC m=+415.971674670" lastFinishedPulling="2026-04-21 04:45:31.992119038 +0000 UTC m=+418.386480757" observedRunningTime="2026-04-21 04:45:32.376731161 +0000 UTC m=+418.771092901" watchObservedRunningTime="2026-04-21 04:45:32.377661186 +0000 UTC m=+418.772022925" Apr 21 04:45:32.394437 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:32.394393 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" podStartSLOduration=1.905226321 podStartE2EDuration="4.394382204s" podCreationTimestamp="2026-04-21 04:45:28 +0000 UTC" firstStartedPulling="2026-04-21 04:45:29.496874112 +0000 UTC m=+415.891235837" lastFinishedPulling="2026-04-21 04:45:31.986030003 +0000 UTC m=+418.380391720" observedRunningTime="2026-04-21 04:45:32.393479531 +0000 UTC m=+418.787841271" watchObservedRunningTime="2026-04-21 04:45:32.394382204 +0000 UTC m=+418.788743943" Apr 21 04:45:32.450512 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:32.450475 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:33.376504 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:33.376464 2570 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.132.0.16:15021/healthz/ready\": dial tcp 10.132.0.16:15021: connect: connection refused" start-of-body= Apr 21 04:45:33.376850 ip-10-0-140-11 kubenswrapper[2570]: 
I0421 04:45:33.376529 2570 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" podUID="aaefc3d7-8065-4621-a896-47536e3037da" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.16:15021/healthz/ready\": dial tcp 10.132.0.16:15021: connect: connection refused" Apr 21 04:45:33.455564 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:33.455537 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:34.365261 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:34.365220 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:34.366020 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:34.366002 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f7gchz" Apr 21 04:45:34.375480 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:34.375458 2570 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.132.0.16:15021/healthz/ready\": dial tcp 10.132.0.16:15021: connect: connection refused" start-of-body= Apr 21 04:45:34.375563 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:34.375519 2570 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" podUID="aaefc3d7-8065-4621-a896-47536e3037da" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.16:15021/healthz/ready\": dial tcp 10.132.0.16:15021: connect: connection refused" Apr 21 04:45:34.417277 ip-10-0-140-11 kubenswrapper[2570]: I0421 
04:45:34.417244 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv"] Apr 21 04:45:35.367900 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:35.367862 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" podUID="aaefc3d7-8065-4621-a896-47536e3037da" containerName="istio-proxy" containerID="cri-o://13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365" gracePeriod=30 Apr 21 04:45:40.602248 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.602225 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:40.674489 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674413 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-envoy\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674489 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674443 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-certs\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674489 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674479 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv62w\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-kube-api-access-sv62w\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674720 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:45:40.674523 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-socket\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674720 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674545 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aaefc3d7-8065-4621-a896-47536e3037da-istio-podinfo\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674720 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674561 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-istio-token\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674720 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674578 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-data\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674720 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674593 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aaefc3d7-8065-4621-a896-47536e3037da-istiod-ca-cert\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.674720 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.674615 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" 
(UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-credential-socket\") pod \"aaefc3d7-8065-4621-a896-47536e3037da\" (UID: \"aaefc3d7-8065-4621-a896-47536e3037da\") " Apr 21 04:45:40.679809 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.679779 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-data" (OuterVolumeSpecName: "istio-data") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:40.679809 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.679789 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:40.680067 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.679805 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:40.680067 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.679921 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "workload-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:40.680067 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.679940 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaefc3d7-8065-4621-a896-47536e3037da-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:45:40.681735 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.681710 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:45:40.681837 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.681759 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aaefc3d7-8065-4621-a896-47536e3037da-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 21 04:45:40.682008 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.681994 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-kube-api-access-sv62w" (OuterVolumeSpecName: "kube-api-access-sv62w") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "kube-api-access-sv62w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:45:40.682052 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.682006 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-istio-token" (OuterVolumeSpecName: "istio-token") pod "aaefc3d7-8065-4621-a896-47536e3037da" (UID: "aaefc3d7-8065-4621-a896-47536e3037da"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:45:40.776066 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776037 2570 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-certs\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776066 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776064 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sv62w\" (UniqueName: \"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-kube-api-access-sv62w\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776075 2570 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-workload-socket\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776085 2570 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aaefc3d7-8065-4621-a896-47536e3037da-istio-podinfo\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776093 2570 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/aaefc3d7-8065-4621-a896-47536e3037da-istio-token\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776101 2570 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-data\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776108 2570 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aaefc3d7-8065-4621-a896-47536e3037da-istiod-ca-cert\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776116 2570 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-credential-socket\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:40.776209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:40.776128 2570 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aaefc3d7-8065-4621-a896-47536e3037da-istio-envoy\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:45:41.384295 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.384266 2570 generic.go:358] "Generic (PLEG): container finished" podID="aaefc3d7-8065-4621-a896-47536e3037da" containerID="13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365" exitCode=0 Apr 21 04:45:41.384459 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.384323 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" Apr 21 04:45:41.384459 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.384342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" event={"ID":"aaefc3d7-8065-4621-a896-47536e3037da","Type":"ContainerDied","Data":"13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365"} Apr 21 04:45:41.384459 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.384379 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv" event={"ID":"aaefc3d7-8065-4621-a896-47536e3037da","Type":"ContainerDied","Data":"bfcc6d8a2a7f433be053c90517a44bdb408d945a4c71c598f0f25c918bc709f8"} Apr 21 04:45:41.384459 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.384395 2570 scope.go:117] "RemoveContainer" containerID="13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365" Apr 21 04:45:41.392629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.392553 2570 scope.go:117] "RemoveContainer" containerID="13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365" Apr 21 04:45:41.392818 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:45:41.392800 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365\": container with ID starting with 13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365 not found: ID does not exist" containerID="13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365" Apr 21 04:45:41.392876 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.392831 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365"} err="failed to 
get container status \"13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365\": rpc error: code = NotFound desc = could not find container \"13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365\": container with ID starting with 13fef621183149cb0e3959cef0aec33b32bd2697ceca501d6efb7d8087152365 not found: ID does not exist" Apr 21 04:45:41.405838 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.405817 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv"] Apr 21 04:45:41.409852 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:41.409832 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55fc86bb7724rjv"] Apr 21 04:45:42.219287 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:45:42.219257 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaefc3d7-8065-4621-a896-47536e3037da" path="/var/lib/kubelet/pods/aaefc3d7-8065-4621-a896-47536e3037da/volumes" Apr 21 04:46:09.691061 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.690980 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r4tw7"] Apr 21 04:46:09.691546 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.691256 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaefc3d7-8065-4621-a896-47536e3037da" containerName="istio-proxy" Apr 21 04:46:09.691546 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.691268 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaefc3d7-8065-4621-a896-47536e3037da" containerName="istio-proxy" Apr 21 04:46:09.691546 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.691321 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaefc3d7-8065-4621-a896-47536e3037da" containerName="istio-proxy" Apr 21 04:46:09.701122 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.701100 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:09.701690 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.701655 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r4tw7"] Apr 21 04:46:09.703648 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.703615 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:46:09.704610 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.704588 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-h7dgm\"" Apr 21 04:46:09.704710 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.704659 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:46:09.770532 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.770489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsdt\" (UniqueName: \"kubernetes.io/projected/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4-kube-api-access-nmsdt\") pod \"kuadrant-operator-catalog-r4tw7\" (UID: \"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4\") " pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:09.871759 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.871714 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsdt\" (UniqueName: \"kubernetes.io/projected/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4-kube-api-access-nmsdt\") pod \"kuadrant-operator-catalog-r4tw7\" (UID: \"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4\") " pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:09.879978 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:09.879947 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nmsdt\" (UniqueName: \"kubernetes.io/projected/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4-kube-api-access-nmsdt\") pod \"kuadrant-operator-catalog-r4tw7\" (UID: \"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4\") " pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:10.011913 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.011822 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:10.061742 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.061697 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r4tw7"] Apr 21 04:46:10.124053 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.124018 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r4tw7"] Apr 21 04:46:10.128279 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:46:10.128248 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9adfc2db_a8e9_4c47_9cc5_f02bbb502de4.slice/crio-5d921bcc2e9e7d01ad90fac477df7b1960e80fa7cb70eec33f95d4c361581776 WatchSource:0}: Error finding container 5d921bcc2e9e7d01ad90fac477df7b1960e80fa7cb70eec33f95d4c361581776: Status 404 returned error can't find the container with id 5d921bcc2e9e7d01ad90fac477df7b1960e80fa7cb70eec33f95d4c361581776 Apr 21 04:46:10.277193 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.277123 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-6zvhn"] Apr 21 04:46:10.281913 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.281896 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:10.287208 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.287181 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-6zvhn"] Apr 21 04:46:10.375227 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.375197 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfk2d\" (UniqueName: \"kubernetes.io/projected/8d587dde-69cc-4333-8982-70e6e4ceb562-kube-api-access-mfk2d\") pod \"kuadrant-operator-catalog-6zvhn\" (UID: \"8d587dde-69cc-4333-8982-70e6e4ceb562\") " pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:10.459218 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.459182 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" event={"ID":"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4","Type":"ContainerStarted","Data":"5d921bcc2e9e7d01ad90fac477df7b1960e80fa7cb70eec33f95d4c361581776"} Apr 21 04:46:10.476112 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.476089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk2d\" (UniqueName: \"kubernetes.io/projected/8d587dde-69cc-4333-8982-70e6e4ceb562-kube-api-access-mfk2d\") pod \"kuadrant-operator-catalog-6zvhn\" (UID: \"8d587dde-69cc-4333-8982-70e6e4ceb562\") " pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:10.484155 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.484111 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfk2d\" (UniqueName: \"kubernetes.io/projected/8d587dde-69cc-4333-8982-70e6e4ceb562-kube-api-access-mfk2d\") pod \"kuadrant-operator-catalog-6zvhn\" (UID: \"8d587dde-69cc-4333-8982-70e6e4ceb562\") " pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:10.591484 ip-10-0-140-11 kubenswrapper[2570]: I0421 
04:46:10.591446 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:10.706795 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:10.706767 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-6zvhn"] Apr 21 04:46:10.735079 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:46:10.735051 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d587dde_69cc_4333_8982_70e6e4ceb562.slice/crio-17389bbba0eacce03575e14c7978718f5a1543fe7c26ab7ab787724674988e6c WatchSource:0}: Error finding container 17389bbba0eacce03575e14c7978718f5a1543fe7c26ab7ab787724674988e6c: Status 404 returned error can't find the container with id 17389bbba0eacce03575e14c7978718f5a1543fe7c26ab7ab787724674988e6c Apr 21 04:46:11.464516 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:11.464415 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" event={"ID":"8d587dde-69cc-4333-8982-70e6e4ceb562","Type":"ContainerStarted","Data":"17389bbba0eacce03575e14c7978718f5a1543fe7c26ab7ab787724674988e6c"} Apr 21 04:46:12.468754 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.468669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" event={"ID":"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4","Type":"ContainerStarted","Data":"71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73"} Apr 21 04:46:12.469157 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.468781 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" podUID="9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" containerName="registry-server" containerID="cri-o://71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73" gracePeriod=2 Apr 21 04:46:12.470159 
ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.470129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" event={"ID":"8d587dde-69cc-4333-8982-70e6e4ceb562","Type":"ContainerStarted","Data":"d7f3d07bb406da91591618fb7969662070048333cd60f2e722c88efb3d03cbe3"} Apr 21 04:46:12.484090 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.483526 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" podStartSLOduration=1.401511834 podStartE2EDuration="3.483510368s" podCreationTimestamp="2026-04-21 04:46:09 +0000 UTC" firstStartedPulling="2026-04-21 04:46:10.129959624 +0000 UTC m=+456.524321342" lastFinishedPulling="2026-04-21 04:46:12.211958151 +0000 UTC m=+458.606319876" observedRunningTime="2026-04-21 04:46:12.483360872 +0000 UTC m=+458.877722612" watchObservedRunningTime="2026-04-21 04:46:12.483510368 +0000 UTC m=+458.877872106" Apr 21 04:46:12.501179 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.501140 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" podStartSLOduration=1.022231155 podStartE2EDuration="2.501128677s" podCreationTimestamp="2026-04-21 04:46:10 +0000 UTC" firstStartedPulling="2026-04-21 04:46:10.736372698 +0000 UTC m=+457.130734416" lastFinishedPulling="2026-04-21 04:46:12.215270221 +0000 UTC m=+458.609631938" observedRunningTime="2026-04-21 04:46:12.499664049 +0000 UTC m=+458.894025790" watchObservedRunningTime="2026-04-21 04:46:12.501128677 +0000 UTC m=+458.895490416" Apr 21 04:46:12.695638 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.695616 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:12.795629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.795603 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsdt\" (UniqueName: \"kubernetes.io/projected/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4-kube-api-access-nmsdt\") pod \"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4\" (UID: \"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4\") " Apr 21 04:46:12.797703 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.797677 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4-kube-api-access-nmsdt" (OuterVolumeSpecName: "kube-api-access-nmsdt") pod "9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" (UID: "9adfc2db-a8e9-4c47-9cc5-f02bbb502de4"). InnerVolumeSpecName "kube-api-access-nmsdt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:46:12.897141 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:12.897108 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmsdt\" (UniqueName: \"kubernetes.io/projected/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4-kube-api-access-nmsdt\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:46:13.474183 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.474149 2570 generic.go:358] "Generic (PLEG): container finished" podID="9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" containerID="71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73" exitCode=0 Apr 21 04:46:13.474578 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.474205 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" Apr 21 04:46:13.474578 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.474241 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" event={"ID":"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4","Type":"ContainerDied","Data":"71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73"} Apr 21 04:46:13.474578 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.474279 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r4tw7" event={"ID":"9adfc2db-a8e9-4c47-9cc5-f02bbb502de4","Type":"ContainerDied","Data":"5d921bcc2e9e7d01ad90fac477df7b1960e80fa7cb70eec33f95d4c361581776"} Apr 21 04:46:13.474578 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.474296 2570 scope.go:117] "RemoveContainer" containerID="71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73" Apr 21 04:46:13.482345 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.482328 2570 scope.go:117] "RemoveContainer" containerID="71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73" Apr 21 04:46:13.482618 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:46:13.482599 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73\": container with ID starting with 71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73 not found: ID does not exist" containerID="71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73" Apr 21 04:46:13.482671 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.482625 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73"} err="failed to get container status \"71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73\": rpc error: 
code = NotFound desc = could not find container \"71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73\": container with ID starting with 71fbdf48a9e7317526348b003aebf0f49d14df46aae25b8aa88937aa02c46b73 not found: ID does not exist" Apr 21 04:46:13.493903 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.493879 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r4tw7"] Apr 21 04:46:13.495593 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:13.495572 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r4tw7"] Apr 21 04:46:14.220269 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:14.220235 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" path="/var/lib/kubelet/pods/9adfc2db-a8e9-4c47-9cc5-f02bbb502de4/volumes" Apr 21 04:46:20.591792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:20.591756 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:20.591792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:20.591798 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:20.612456 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:20.612429 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:21.521191 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:21.521163 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-6zvhn" Apr 21 04:46:43.518723 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.518690 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g"] Apr 21 04:46:43.519177 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:46:43.518938 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" containerName="registry-server" Apr 21 04:46:43.519177 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.518954 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" containerName="registry-server" Apr 21 04:46:43.519177 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.519015 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9adfc2db-a8e9-4c47-9cc5-f02bbb502de4" containerName="registry-server" Apr 21 04:46:43.526458 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.526436 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.529215 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.529191 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rxf4s\"" Apr 21 04:46:43.540260 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.540239 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g"] Apr 21 04:46:43.613892 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.613869 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrmj\" (UniqueName: \"kubernetes.io/projected/a326b59c-7538-458d-867d-4803904a0154-kube-api-access-wqrmj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.614006 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.613927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a326b59c-7538-458d-867d-4803904a0154-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.715193 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.715167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a326b59c-7538-458d-867d-4803904a0154-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.715317 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.715215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrmj\" (UniqueName: \"kubernetes.io/projected/a326b59c-7538-458d-867d-4803904a0154-kube-api-access-wqrmj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.715536 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.715520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a326b59c-7538-458d-867d-4803904a0154-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.725959 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.725939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrmj\" (UniqueName: 
\"kubernetes.io/projected/a326b59c-7538-458d-867d-4803904a0154-kube-api-access-wqrmj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.836054 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.836025 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:43.968513 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:43.968424 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g"] Apr 21 04:46:43.972479 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:46:43.972448 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda326b59c_7538_458d_867d_4803904a0154.slice/crio-aa8ae1fe8d87adcddf305b5a619d36f1da5269e9ac2111a169e1cd5b61ee74b9 WatchSource:0}: Error finding container aa8ae1fe8d87adcddf305b5a619d36f1da5269e9ac2111a169e1cd5b61ee74b9: Status 404 returned error can't find the container with id aa8ae1fe8d87adcddf305b5a619d36f1da5269e9ac2111a169e1cd5b61ee74b9 Apr 21 04:46:44.562965 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:44.562926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" event={"ID":"a326b59c-7538-458d-867d-4803904a0154","Type":"ContainerStarted","Data":"aa8ae1fe8d87adcddf305b5a619d36f1da5269e9ac2111a169e1cd5b61ee74b9"} Apr 21 04:46:49.580548 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:49.580508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" 
event={"ID":"a326b59c-7538-458d-867d-4803904a0154","Type":"ContainerStarted","Data":"371ee4577287a81fcd2ec0ed04e73704c2ac670be407caf43a1cb08795b273a0"} Apr 21 04:46:49.580934 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:49.580809 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:46:49.603619 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:49.603570 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" podStartSLOduration=1.428390013 podStartE2EDuration="6.603555885s" podCreationTimestamp="2026-04-21 04:46:43 +0000 UTC" firstStartedPulling="2026-04-21 04:46:43.974932209 +0000 UTC m=+490.369293931" lastFinishedPulling="2026-04-21 04:46:49.150098082 +0000 UTC m=+495.544459803" observedRunningTime="2026-04-21 04:46:49.602347234 +0000 UTC m=+495.996708973" watchObservedRunningTime="2026-04-21 04:46:49.603555885 +0000 UTC m=+495.997917626" Apr 21 04:46:51.924478 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:51.924441 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm"] Apr 21 04:46:51.927849 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:51.927829 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:51.932584 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:51.932564 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 04:46:51.932839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:51.932815 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 04:46:51.933053 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:51.933038 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pgtpf\"" Apr 21 04:46:51.944837 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:51.944816 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm"] Apr 21 04:46:52.077277 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.077244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.077435 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.077311 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2w9\" (UniqueName: \"kubernetes.io/projected/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-kube-api-access-mj2w9\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.077435 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.077352 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.178220 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.178141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.178220 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.178184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.178393 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.178225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2w9\" (UniqueName: \"kubernetes.io/projected/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-kube-api-access-mj2w9\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.178393 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:46:52.178348 2570 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 21 04:46:52.178456 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:46:52.178422 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-plugin-serving-cert podName:5b3dc471-0d48-47e9-a29a-d7e0617bc84d nodeName:}" failed. No retries permitted until 2026-04-21 04:46:52.678399085 +0000 UTC m=+499.072760804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-bfnvm" (UID: "5b3dc471-0d48-47e9-a29a-d7e0617bc84d") : secret "plugin-serving-cert" not found Apr 21 04:46:52.178893 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.178875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.200457 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.200432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2w9\" (UniqueName: \"kubernetes.io/projected/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-kube-api-access-mj2w9\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.681469 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.681416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.683803 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.683785 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3dc471-0d48-47e9-a29a-d7e0617bc84d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-bfnvm\" (UID: \"5b3dc471-0d48-47e9-a29a-d7e0617bc84d\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.748344 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.748307 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd"] Apr 21 04:46:52.754562 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.754531 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:46:52.756981 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.756964 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-nflgw\"" Apr 21 04:46:52.763543 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.763519 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd"] Apr 21 04:46:52.837281 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.837257 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" Apr 21 04:46:52.883476 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.883345 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k6sl\" (UniqueName: \"kubernetes.io/projected/6dded86f-c420-4864-a14f-3d349558ddf4-kube-api-access-7k6sl\") pod \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" (UID: \"6dded86f-c420-4864-a14f-3d349558ddf4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:46:52.953698 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.953566 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm"] Apr 21 04:46:52.956274 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:46:52.956246 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b3dc471_0d48_47e9_a29a_d7e0617bc84d.slice/crio-bac73c3d18641d0baf857e6ac47841e1088ef17f6299f5c7033f5e543bcd9a9a WatchSource:0}: Error finding container bac73c3d18641d0baf857e6ac47841e1088ef17f6299f5c7033f5e543bcd9a9a: Status 404 returned error can't find the container with id bac73c3d18641d0baf857e6ac47841e1088ef17f6299f5c7033f5e543bcd9a9a Apr 21 04:46:52.983967 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.983941 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k6sl\" (UniqueName: \"kubernetes.io/projected/6dded86f-c420-4864-a14f-3d349558ddf4-kube-api-access-7k6sl\") pod \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" (UID: \"6dded86f-c420-4864-a14f-3d349558ddf4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:46:52.991925 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:52.991900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7k6sl\" (UniqueName: \"kubernetes.io/projected/6dded86f-c420-4864-a14f-3d349558ddf4-kube-api-access-7k6sl\") pod \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" (UID: \"6dded86f-c420-4864-a14f-3d349558ddf4\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:46:53.065426 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:53.065410 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:46:53.180412 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:53.180348 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd"] Apr 21 04:46:53.182808 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:46:53.182782 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dded86f_c420_4864_a14f_3d349558ddf4.slice/crio-199d0d81a386c36dba245783a9d565123c98d981818d6e10f390e02fca4564a6 WatchSource:0}: Error finding container 199d0d81a386c36dba245783a9d565123c98d981818d6e10f390e02fca4564a6: Status 404 returned error can't find the container with id 199d0d81a386c36dba245783a9d565123c98d981818d6e10f390e02fca4564a6 Apr 21 04:46:53.593673 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:53.593630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" event={"ID":"6dded86f-c420-4864-a14f-3d349558ddf4","Type":"ContainerStarted","Data":"199d0d81a386c36dba245783a9d565123c98d981818d6e10f390e02fca4564a6"} Apr 21 04:46:53.594515 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:53.594462 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" 
event={"ID":"5b3dc471-0d48-47e9-a29a-d7e0617bc84d","Type":"ContainerStarted","Data":"bac73c3d18641d0baf857e6ac47841e1088ef17f6299f5c7033f5e543bcd9a9a"} Apr 21 04:46:55.603573 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:55.603537 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" event={"ID":"6dded86f-c420-4864-a14f-3d349558ddf4","Type":"ContainerStarted","Data":"ccbdb27ca7cbc1c532351bc21de229bcd4e11f83e2293173454191edceb9edbb"} Apr 21 04:46:55.604033 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:55.603684 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:46:55.621362 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:46:55.621311 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" podStartSLOduration=1.527801414 podStartE2EDuration="3.621291589s" podCreationTimestamp="2026-04-21 04:46:52 +0000 UTC" firstStartedPulling="2026-04-21 04:46:53.184649678 +0000 UTC m=+499.579011397" lastFinishedPulling="2026-04-21 04:46:55.278139839 +0000 UTC m=+501.672501572" observedRunningTime="2026-04-21 04:46:55.620529543 +0000 UTC m=+502.014891280" watchObservedRunningTime="2026-04-21 04:46:55.621291589 +0000 UTC m=+502.015653330" Apr 21 04:47:00.585662 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:00.585630 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:47:02.356262 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.356225 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g"] Apr 21 04:47:02.356738 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.356470 2570 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" podUID="a326b59c-7538-458d-867d-4803904a0154" containerName="manager" containerID="cri-o://371ee4577287a81fcd2ec0ed04e73704c2ac670be407caf43a1cb08795b273a0" gracePeriod=2 Apr 21 04:47:02.358796 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.358762 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:02.360725 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.360669 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g"] Apr 21 04:47:02.371417 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.371366 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd"] Apr 21 04:47:02.372126 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.372100 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" containerName="manager" containerID="cri-o://ccbdb27ca7cbc1c532351bc21de229bcd4e11f83e2293173454191edceb9edbb" gracePeriod=2 Apr 21 04:47:02.373756 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.373727 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" 
is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:02.374284 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.374000 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:47:02.379353 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.379329 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd"] Apr 21 04:47:02.382374 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.382195 2570 status_manager.go:895] "Failed to get status for pod" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:02.383895 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.383866 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:02.399438 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399059 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx"] Apr 21 04:47:02.399438 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399403 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a326b59c-7538-458d-867d-4803904a0154" containerName="manager" Apr 21 04:47:02.399438 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399419 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a326b59c-7538-458d-867d-4803904a0154" containerName="manager" Apr 21 04:47:02.399438 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399437 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" containerName="manager" Apr 21 04:47:02.399438 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399446 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" containerName="manager" Apr 21 04:47:02.399769 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399539 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a326b59c-7538-458d-867d-4803904a0154" containerName="manager" Apr 21 04:47:02.399769 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.399553 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" containerName="manager" Apr 21 04:47:02.403605 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.403585 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:02.406087 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.406057 2570 status_manager.go:895] "Failed to get status for pod" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:02.414613 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.414588 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx"] Apr 21 04:47:02.429209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.429167 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:02.563294 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.563259 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmmp\" (UniqueName: \"kubernetes.io/projected/7abebefe-81ff-403d-b854-b622a7aa705e-kube-api-access-lsmmp\") pod \"limitador-operator-controller-manager-85c4996f8c-9tdqx\" (UID: \"7abebefe-81ff-403d-b854-b622a7aa705e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:02.664257 ip-10-0-140-11 kubenswrapper[2570]: 
I0421 04:47:02.664164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsmmp\" (UniqueName: \"kubernetes.io/projected/7abebefe-81ff-403d-b854-b622a7aa705e-kube-api-access-lsmmp\") pod \"limitador-operator-controller-manager-85c4996f8c-9tdqx\" (UID: \"7abebefe-81ff-403d-b854-b622a7aa705e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:02.672196 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.672166 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsmmp\" (UniqueName: \"kubernetes.io/projected/7abebefe-81ff-403d-b854-b622a7aa705e-kube-api-access-lsmmp\") pod \"limitador-operator-controller-manager-85c4996f8c-9tdqx\" (UID: \"7abebefe-81ff-403d-b854-b622a7aa705e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:02.774154 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:02.774116 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:04.219238 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:04.219196 2570 status_manager.go:895] "Failed to get status for pod" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:04.221061 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:04.221024 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:15.673203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.673169 2570 generic.go:358] "Generic (PLEG): container finished" podID="a326b59c-7538-458d-867d-4803904a0154" containerID="371ee4577287a81fcd2ec0ed04e73704c2ac670be407caf43a1cb08795b273a0" exitCode=0 Apr 21 04:47:15.674614 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.674584 2570 generic.go:358] "Generic (PLEG): container finished" podID="6dded86f-c420-4864-a14f-3d349558ddf4" containerID="ccbdb27ca7cbc1c532351bc21de229bcd4e11f83e2293173454191edceb9edbb" exitCode=0 Apr 21 04:47:15.674716 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.674633 2570 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="199d0d81a386c36dba245783a9d565123c98d981818d6e10f390e02fca4564a6" Apr 21 04:47:15.676929 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.676910 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:47:15.679075 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.679039 2570 status_manager.go:895] "Failed to get status for pod" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:15.681243 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.681225 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:47:15.683031 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.683011 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:15.684552 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.684534 2570 status_manager.go:895] "Failed to get status for pod" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:15.765667 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.765624 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqrmj\" (UniqueName: \"kubernetes.io/projected/a326b59c-7538-458d-867d-4803904a0154-kube-api-access-wqrmj\") pod \"a326b59c-7538-458d-867d-4803904a0154\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " Apr 21 04:47:15.765667 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.765679 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k6sl\" (UniqueName: \"kubernetes.io/projected/6dded86f-c420-4864-a14f-3d349558ddf4-kube-api-access-7k6sl\") pod \"6dded86f-c420-4864-a14f-3d349558ddf4\" (UID: \"6dded86f-c420-4864-a14f-3d349558ddf4\") " Apr 
21 04:47:15.765897 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.765714 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a326b59c-7538-458d-867d-4803904a0154-extensions-socket-volume\") pod \"a326b59c-7538-458d-867d-4803904a0154\" (UID: \"a326b59c-7538-458d-867d-4803904a0154\") " Apr 21 04:47:15.766248 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.766220 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a326b59c-7538-458d-867d-4803904a0154-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a326b59c-7538-458d-867d-4803904a0154" (UID: "a326b59c-7538-458d-867d-4803904a0154"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:47:15.767613 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.767590 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a326b59c-7538-458d-867d-4803904a0154-kube-api-access-wqrmj" (OuterVolumeSpecName: "kube-api-access-wqrmj") pod "a326b59c-7538-458d-867d-4803904a0154" (UID: "a326b59c-7538-458d-867d-4803904a0154"). InnerVolumeSpecName "kube-api-access-wqrmj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:47:15.767750 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.767718 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dded86f-c420-4864-a14f-3d349558ddf4-kube-api-access-7k6sl" (OuterVolumeSpecName: "kube-api-access-7k6sl") pod "6dded86f-c420-4864-a14f-3d349558ddf4" (UID: "6dded86f-c420-4864-a14f-3d349558ddf4"). InnerVolumeSpecName "kube-api-access-7k6sl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:47:15.866754 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.866726 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7k6sl\" (UniqueName: \"kubernetes.io/projected/6dded86f-c420-4864-a14f-3d349558ddf4-kube-api-access-7k6sl\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:47:15.866754 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.866750 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a326b59c-7538-458d-867d-4803904a0154-extensions-socket-volume\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:47:15.866754 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.866760 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqrmj\" (UniqueName: \"kubernetes.io/projected/a326b59c-7538-458d-867d-4803904a0154-kube-api-access-wqrmj\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:47:15.876886 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:15.876858 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx"] Apr 21 04:47:15.880010 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:47:15.879986 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7abebefe_81ff_403d_b854_b622a7aa705e.slice/crio-d1f2eb0c3dc787903945ccbce44401e5abceeef753528e9444c04561106b717a WatchSource:0}: Error finding container d1f2eb0c3dc787903945ccbce44401e5abceeef753528e9444c04561106b717a: Status 404 returned error can't find the container with id d1f2eb0c3dc787903945ccbce44401e5abceeef753528e9444c04561106b717a Apr 21 04:47:16.219925 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.219829 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" 
path="/var/lib/kubelet/pods/6dded86f-c420-4864-a14f-3d349558ddf4/volumes" Apr 21 04:47:16.220290 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.220267 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a326b59c-7538-458d-867d-4803904a0154" path="/var/lib/kubelet/pods/a326b59c-7538-458d-867d-4803904a0154/volumes" Apr 21 04:47:16.678363 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.678335 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" Apr 21 04:47:16.678849 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.678325 2570 scope.go:117] "RemoveContainer" containerID="371ee4577287a81fcd2ec0ed04e73704c2ac670be407caf43a1cb08795b273a0" Apr 21 04:47:16.679856 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.679826 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" event={"ID":"5b3dc471-0d48-47e9-a29a-d7e0617bc84d","Type":"ContainerStarted","Data":"9b0b7a7554bc79d572239ee1045c5311979c19f9a29db85b598f7d27bf52ec57"} Apr 21 04:47:16.680694 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.680654 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:16.681528 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.681488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" 
event={"ID":"7abebefe-81ff-403d-b854-b622a7aa705e","Type":"ContainerStarted","Data":"bd08a3077af72f85cdc50ba3a328e98d988351da481064c2cb91daa875f660e8"} Apr 21 04:47:16.681627 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.681534 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" event={"ID":"7abebefe-81ff-403d-b854-b622a7aa705e","Type":"ContainerStarted","Data":"d1f2eb0c3dc787903945ccbce44401e5abceeef753528e9444c04561106b717a"} Apr 21 04:47:16.681627 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.681541 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" Apr 21 04:47:16.681721 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.681635 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:16.702326 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.702287 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-bfnvm" podStartSLOduration=3.02025332 podStartE2EDuration="25.702273859s" podCreationTimestamp="2026-04-21 04:46:51 +0000 UTC" firstStartedPulling="2026-04-21 04:46:52.957667188 +0000 UTC m=+499.352028908" lastFinishedPulling="2026-04-21 04:47:15.639687723 +0000 UTC m=+522.034049447" observedRunningTime="2026-04-21 04:47:16.701246226 +0000 UTC m=+523.095607978" watchObservedRunningTime="2026-04-21 04:47:16.702273859 +0000 UTC m=+523.096635599" Apr 21 04:47:16.703138 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.703117 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User 
\"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:16.704744 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.704724 2570 status_manager.go:895] "Failed to get status for pod" podUID="6dded86f-c420-4864-a14f-3d349558ddf4" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-q4vrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-q4vrd\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:16.728126 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.728077 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" podStartSLOduration=14.728061983 podStartE2EDuration="14.728061983s" podCreationTimestamp="2026-04-21 04:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:47:16.72760111 +0000 UTC m=+523.121962849" watchObservedRunningTime="2026-04-21 04:47:16.728061983 +0000 UTC m=+523.122423928" Apr 21 04:47:16.729581 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:16.729553 2570 status_manager.go:895] "Failed to get status for pod" podUID="a326b59c-7538-458d-867d-4803904a0154" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-r947g" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-r947g\" is forbidden: User \"system:node:ip-10-0-140-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-11.ec2.internal' and this object" Apr 21 04:47:27.688767 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:47:27.688686 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9tdqx" Apr 21 04:47:35.536762 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.536731 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9"] Apr 21 04:47:35.611558 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.611525 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9"] Apr 21 04:47:35.611697 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.611633 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.615034 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.615014 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-zcz4w\"" Apr 21 04:47:35.709888 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.709859 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5e853989-b263-48f1-ae11-7871beb8ab55-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710030 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.709896 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710030 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.709922 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710030 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.709939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710030 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.710017 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hzg\" (UniqueName: \"kubernetes.io/projected/5e853989-b263-48f1-ae11-7871beb8ab55-kube-api-access-f5hzg\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710195 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.710060 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710195 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.710108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710195 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.710148 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5e853989-b263-48f1-ae11-7871beb8ab55-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.710195 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.710165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5e853989-b263-48f1-ae11-7871beb8ab55-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.810829 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.810758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5e853989-b263-48f1-ae11-7871beb8ab55-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.810829 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.810792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5e853989-b263-48f1-ae11-7871beb8ab55-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.810829 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.810825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5e853989-b263-48f1-ae11-7871beb8ab55-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.810943 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811060 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.810988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811163 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:47:35.811093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811163 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hzg\" (UniqueName: \"kubernetes.io/projected/5e853989-b263-48f1-ae11-7871beb8ab55-kube-api-access-f5hzg\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811267 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811267 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811368 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811368 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811553 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811527 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811615 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.811876 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.811856 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5e853989-b263-48f1-ae11-7871beb8ab55-istiod-ca-cert\") pod 
\"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.813233 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.813215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5e853989-b263-48f1-ae11-7871beb8ab55-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.813342 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.813322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5e853989-b263-48f1-ae11-7871beb8ab55-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.819845 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.819823 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5e853989-b263-48f1-ae11-7871beb8ab55-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.820878 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.820859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hzg\" (UniqueName: \"kubernetes.io/projected/5e853989-b263-48f1-ae11-7871beb8ab55-kube-api-access-f5hzg\") pod \"maas-default-gateway-openshift-default-845c6b4b48-xrgv9\" (UID: \"5e853989-b263-48f1-ae11-7871beb8ab55\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:35.920685 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:35.920667 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:36.037891 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.037859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9"] Apr 21 04:47:36.041799 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:47:36.041768 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e853989_b263_48f1_ae11_7871beb8ab55.slice/crio-3f4db0455c24519b3d9c621fead12898d84804b04e719c3cf059c4aaf1cc5bbe WatchSource:0}: Error finding container 3f4db0455c24519b3d9c621fead12898d84804b04e719c3cf059c4aaf1cc5bbe: Status 404 returned error can't find the container with id 3f4db0455c24519b3d9c621fead12898d84804b04e719c3cf059c4aaf1cc5bbe Apr 21 04:47:36.043929 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.043899 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:47:36.044002 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.043966 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:47:36.044002 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.043993 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 21 04:47:36.744576 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.744536 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" event={"ID":"5e853989-b263-48f1-ae11-7871beb8ab55","Type":"ContainerStarted","Data":"8ec19ab7c00452cfef622b30f699df98d9abd8889381bac3b7b21df3a67cd93b"} Apr 21 04:47:36.744955 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.744582 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" event={"ID":"5e853989-b263-48f1-ae11-7871beb8ab55","Type":"ContainerStarted","Data":"3f4db0455c24519b3d9c621fead12898d84804b04e719c3cf059c4aaf1cc5bbe"} Apr 21 04:47:36.766071 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.766005 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" podStartSLOduration=1.765989148 podStartE2EDuration="1.765989148s" podCreationTimestamp="2026-04-21 04:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:47:36.763188522 +0000 UTC m=+543.157550266" watchObservedRunningTime="2026-04-21 04:47:36.765989148 +0000 UTC m=+543.160350890" Apr 21 04:47:36.921446 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:36.921414 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:37.925539 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:37.925511 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:38.751429 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:38.751400 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:38.752302 ip-10-0-140-11 
kubenswrapper[2570]: I0421 04:47:38.752279 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-xrgv9" Apr 21 04:47:40.145667 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.145635 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:47:40.148904 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.148885 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.152142 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.152121 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 04:47:40.169616 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.169592 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:47:40.203813 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.203781 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:47:40.248284 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.248258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/30b776f1-fcaa-4447-9429-5d4c4a68c5c6-config-file\") pod \"limitador-limitador-78c99df468-dtvwz\" (UID: \"30b776f1-fcaa-4447-9429-5d4c4a68c5c6\") " pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.248432 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.248301 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5w4\" (UniqueName: \"kubernetes.io/projected/30b776f1-fcaa-4447-9429-5d4c4a68c5c6-kube-api-access-tt5w4\") pod 
\"limitador-limitador-78c99df468-dtvwz\" (UID: \"30b776f1-fcaa-4447-9429-5d4c4a68c5c6\") " pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.349525 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.349476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/30b776f1-fcaa-4447-9429-5d4c4a68c5c6-config-file\") pod \"limitador-limitador-78c99df468-dtvwz\" (UID: \"30b776f1-fcaa-4447-9429-5d4c4a68c5c6\") " pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.349687 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.349546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5w4\" (UniqueName: \"kubernetes.io/projected/30b776f1-fcaa-4447-9429-5d4c4a68c5c6-kube-api-access-tt5w4\") pod \"limitador-limitador-78c99df468-dtvwz\" (UID: \"30b776f1-fcaa-4447-9429-5d4c4a68c5c6\") " pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.350044 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.350024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/30b776f1-fcaa-4447-9429-5d4c4a68c5c6-config-file\") pod \"limitador-limitador-78c99df468-dtvwz\" (UID: \"30b776f1-fcaa-4447-9429-5d4c4a68c5c6\") " pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.358910 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.358891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5w4\" (UniqueName: \"kubernetes.io/projected/30b776f1-fcaa-4447-9429-5d4c4a68c5c6-kube-api-access-tt5w4\") pod \"limitador-limitador-78c99df468-dtvwz\" (UID: \"30b776f1-fcaa-4447-9429-5d4c4a68c5c6\") " pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.459229 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.459170 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:40.593676 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.593641 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:47:40.757789 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:40.757714 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" event={"ID":"30b776f1-fcaa-4447-9429-5d4c4a68c5c6","Type":"ContainerStarted","Data":"1f3435b7ef3db2b4e8e84c1316fc172c7bc592ea80a368287b0f508fec09680e"} Apr 21 04:47:43.769901 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:43.769869 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" event={"ID":"30b776f1-fcaa-4447-9429-5d4c4a68c5c6","Type":"ContainerStarted","Data":"ca905d636283db0438c5b0a31818ab3a4647a5e8121d5787863fc4077b857c98"} Apr 21 04:47:43.770260 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:43.770014 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:47:43.787173 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:43.787129 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" podStartSLOduration=1.319435292 podStartE2EDuration="3.787117385s" podCreationTimestamp="2026-04-21 04:47:40 +0000 UTC" firstStartedPulling="2026-04-21 04:47:40.594521293 +0000 UTC m=+546.988883013" lastFinishedPulling="2026-04-21 04:47:43.062203385 +0000 UTC m=+549.456565106" observedRunningTime="2026-04-21 04:47:43.785433221 +0000 UTC m=+550.179794973" watchObservedRunningTime="2026-04-21 04:47:43.787117385 +0000 UTC m=+550.181479124" Apr 21 04:47:54.774824 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:47:54.774795 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-dtvwz" Apr 21 04:49:15.779147 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.779113 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5bf7dbf957-hdlrf"] Apr 21 04:49:15.782119 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.782102 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:15.784353 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.784329 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 04:49:15.784452 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.784328 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 04:49:15.790107 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.790085 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5bf7dbf957-hdlrf"] Apr 21 04:49:15.868593 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.868562 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw4bf\" (UniqueName: \"kubernetes.io/projected/3f358c94-25f5-48f1-ba47-a4592d8dde66-kube-api-access-vw4bf\") pod \"maas-api-5bf7dbf957-hdlrf\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:15.868743 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.868599 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f358c94-25f5-48f1-ba47-a4592d8dde66-maas-api-tls\") pod \"maas-api-5bf7dbf957-hdlrf\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:15.969047 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.969011 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vw4bf\" (UniqueName: \"kubernetes.io/projected/3f358c94-25f5-48f1-ba47-a4592d8dde66-kube-api-access-vw4bf\") pod \"maas-api-5bf7dbf957-hdlrf\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:15.969047 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.969052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f358c94-25f5-48f1-ba47-a4592d8dde66-maas-api-tls\") pod \"maas-api-5bf7dbf957-hdlrf\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:15.971590 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.971563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f358c94-25f5-48f1-ba47-a4592d8dde66-maas-api-tls\") pod \"maas-api-5bf7dbf957-hdlrf\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:15.980694 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:15.979611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw4bf\" (UniqueName: \"kubernetes.io/projected/3f358c94-25f5-48f1-ba47-a4592d8dde66-kube-api-access-vw4bf\") pod \"maas-api-5bf7dbf957-hdlrf\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:16.092761 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:16.092727 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:16.218514 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:49:16.218465 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f358c94_25f5_48f1_ba47_a4592d8dde66.slice/crio-34e679642341fe5342897debf213beb69602e8ef5dfab60dbce6d477fe9f59ae WatchSource:0}: Error finding container 34e679642341fe5342897debf213beb69602e8ef5dfab60dbce6d477fe9f59ae: Status 404 returned error can't find the container with id 34e679642341fe5342897debf213beb69602e8ef5dfab60dbce6d477fe9f59ae Apr 21 04:49:16.221747 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:16.221725 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5bf7dbf957-hdlrf"] Apr 21 04:49:16.353444 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:16.353358 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:49:17.049809 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:17.049756 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" event={"ID":"3f358c94-25f5-48f1-ba47-a4592d8dde66","Type":"ContainerStarted","Data":"34e679642341fe5342897debf213beb69602e8ef5dfab60dbce6d477fe9f59ae"} Apr 21 04:49:19.058418 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:19.058357 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" event={"ID":"3f358c94-25f5-48f1-ba47-a4592d8dde66","Type":"ContainerStarted","Data":"c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538"} Apr 21 04:49:19.059105 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:19.059082 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:49:19.078824 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:19.078751 2570 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" podStartSLOduration=1.6150534890000001 podStartE2EDuration="4.078736704s" podCreationTimestamp="2026-04-21 04:49:15 +0000 UTC" firstStartedPulling="2026-04-21 04:49:16.219517974 +0000 UTC m=+642.613879695" lastFinishedPulling="2026-04-21 04:49:18.683201185 +0000 UTC m=+645.077562910" observedRunningTime="2026-04-21 04:49:19.078538844 +0000 UTC m=+645.472900584" watchObservedRunningTime="2026-04-21 04:49:19.078736704 +0000 UTC m=+645.473098445" Apr 21 04:49:26.069410 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:49:26.069381 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:50:01.928822 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:01.928783 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5bf7dbf957-hdlrf"] Apr 21 04:50:01.929302 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:01.929050 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" podUID="3f358c94-25f5-48f1-ba47-a4592d8dde66" containerName="maas-api" containerID="cri-o://c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538" gracePeriod=30 Apr 21 04:50:02.167748 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.167714 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:50:02.195343 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.195264 2570 generic.go:358] "Generic (PLEG): container finished" podID="3f358c94-25f5-48f1-ba47-a4592d8dde66" containerID="c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538" exitCode=0 Apr 21 04:50:02.195343 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.195328 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" Apr 21 04:50:02.195343 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.195339 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" event={"ID":"3f358c94-25f5-48f1-ba47-a4592d8dde66","Type":"ContainerDied","Data":"c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538"} Apr 21 04:50:02.195588 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.195368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bf7dbf957-hdlrf" event={"ID":"3f358c94-25f5-48f1-ba47-a4592d8dde66","Type":"ContainerDied","Data":"34e679642341fe5342897debf213beb69602e8ef5dfab60dbce6d477fe9f59ae"} Apr 21 04:50:02.195588 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.195383 2570 scope.go:117] "RemoveContainer" containerID="c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538" Apr 21 04:50:02.203064 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.203042 2570 scope.go:117] "RemoveContainer" containerID="c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538" Apr 21 04:50:02.203346 ip-10-0-140-11 kubenswrapper[2570]: E0421 04:50:02.203325 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538\": container with ID starting with c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538 not found: ID does not exist" containerID="c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538" Apr 21 04:50:02.203389 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.203356 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538"} err="failed to get container status \"c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538\": rpc error: code = NotFound desc = could 
not find container \"c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538\": container with ID starting with c185b14f6d1090dcaa0f43105aba128e05b5ba29b3007390f465c0f239c1a538 not found: ID does not exist" Apr 21 04:50:02.245038 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.245016 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f358c94-25f5-48f1-ba47-a4592d8dde66-maas-api-tls\") pod \"3f358c94-25f5-48f1-ba47-a4592d8dde66\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " Apr 21 04:50:02.245165 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.245069 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw4bf\" (UniqueName: \"kubernetes.io/projected/3f358c94-25f5-48f1-ba47-a4592d8dde66-kube-api-access-vw4bf\") pod \"3f358c94-25f5-48f1-ba47-a4592d8dde66\" (UID: \"3f358c94-25f5-48f1-ba47-a4592d8dde66\") " Apr 21 04:50:02.247211 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.247183 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f358c94-25f5-48f1-ba47-a4592d8dde66-kube-api-access-vw4bf" (OuterVolumeSpecName: "kube-api-access-vw4bf") pod "3f358c94-25f5-48f1-ba47-a4592d8dde66" (UID: "3f358c94-25f5-48f1-ba47-a4592d8dde66"). InnerVolumeSpecName "kube-api-access-vw4bf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:50:02.247313 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.247273 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f358c94-25f5-48f1-ba47-a4592d8dde66-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "3f358c94-25f5-48f1-ba47-a4592d8dde66" (UID: "3f358c94-25f5-48f1-ba47-a4592d8dde66"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:50:02.345882 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.345845 2570 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f358c94-25f5-48f1-ba47-a4592d8dde66-maas-api-tls\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:50:02.345882 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.345881 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vw4bf\" (UniqueName: \"kubernetes.io/projected/3f358c94-25f5-48f1-ba47-a4592d8dde66-kube-api-access-vw4bf\") on node \"ip-10-0-140-11.ec2.internal\" DevicePath \"\"" Apr 21 04:50:02.524089 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.524058 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5bf7dbf957-hdlrf"] Apr 21 04:50:02.527625 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:02.527603 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-5bf7dbf957-hdlrf"] Apr 21 04:50:04.219187 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:04.219153 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f358c94-25f5-48f1-ba47-a4592d8dde66" path="/var/lib/kubelet/pods/3f358c94-25f5-48f1-ba47-a4592d8dde66/volumes" Apr 21 04:50:27.027591 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:27.027513 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:50:34.732364 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:34.732326 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:50:38.494519 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.494464 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj"] Apr 21 04:50:38.494981 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.494740 2570 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f358c94-25f5-48f1-ba47-a4592d8dde66" containerName="maas-api" Apr 21 04:50:38.494981 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.494751 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f358c94-25f5-48f1-ba47-a4592d8dde66" containerName="maas-api" Apr 21 04:50:38.494981 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.494805 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f358c94-25f5-48f1-ba47-a4592d8dde66" containerName="maas-api" Apr 21 04:50:38.499148 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.499125 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.501517 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.501476 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 04:50:38.502458 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.502431 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-pxf66\"" Apr 21 04:50:38.502572 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.502481 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 21 04:50:38.502621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.502587 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 04:50:38.509674 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.509647 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj"] Apr 21 04:50:38.605543 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.605510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.605699 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.605557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d09563ea-7080-4fc7-abd1-78d54ef48e21-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.605699 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.605625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.605699 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.605670 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.605699 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.605692 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbkq5\" (UniqueName: 
\"kubernetes.io/projected/d09563ea-7080-4fc7-abd1-78d54ef48e21-kube-api-access-zbkq5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.605836 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.605726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706387 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706567 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d09563ea-7080-4fc7-abd1-78d54ef48e21-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706567 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-kserve-provision-location\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706567 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706486 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706567 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbkq5\" (UniqueName: \"kubernetes.io/projected/d09563ea-7080-4fc7-abd1-78d54ef48e21-kube-api-access-zbkq5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706567 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706999 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: 
\"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.706999 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.706988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.707158 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.707064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.708684 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.708664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d09563ea-7080-4fc7-abd1-78d54ef48e21-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.708993 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.708978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d09563ea-7080-4fc7-abd1-78d54ef48e21-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.713995 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.713973 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbkq5\" (UniqueName: \"kubernetes.io/projected/d09563ea-7080-4fc7-abd1-78d54ef48e21-kube-api-access-zbkq5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj\" (UID: \"d09563ea-7080-4fc7-abd1-78d54ef48e21\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.810886 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.810852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:38.932621 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.932597 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj"] Apr 21 04:50:38.935200 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:50:38.935172 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd09563ea_7080_4fc7_abd1_78d54ef48e21.slice/crio-89317f3a42ce5954298af9c211ad6365c69aa76c4282338e105f0611364877ef WatchSource:0}: Error finding container 89317f3a42ce5954298af9c211ad6365c69aa76c4282338e105f0611364877ef: Status 404 returned error can't find the container with id 89317f3a42ce5954298af9c211ad6365c69aa76c4282338e105f0611364877ef Apr 21 04:50:38.937007 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:38.936988 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:50:39.311412 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:39.311365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" event={"ID":"d09563ea-7080-4fc7-abd1-78d54ef48e21","Type":"ContainerStarted","Data":"89317f3a42ce5954298af9c211ad6365c69aa76c4282338e105f0611364877ef"} Apr 21 04:50:39.519881 ip-10-0-140-11 kubenswrapper[2570]: I0421 
04:50:39.519848 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:50:44.328371 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:44.328332 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" event={"ID":"d09563ea-7080-4fc7-abd1-78d54ef48e21","Type":"ContainerStarted","Data":"24ff44b04b11287d074ffc0b75a26b697ed18770c6e7862ee1c4df639841340c"} Apr 21 04:50:49.345566 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:49.345527 2570 generic.go:358] "Generic (PLEG): container finished" podID="d09563ea-7080-4fc7-abd1-78d54ef48e21" containerID="24ff44b04b11287d074ffc0b75a26b697ed18770c6e7862ee1c4df639841340c" exitCode=0 Apr 21 04:50:49.345948 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:49.345597 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" event={"ID":"d09563ea-7080-4fc7-abd1-78d54ef48e21","Type":"ContainerDied","Data":"24ff44b04b11287d074ffc0b75a26b697ed18770c6e7862ee1c4df639841340c"} Apr 21 04:50:50.096743 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.096698 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p"] Apr 21 04:50:50.100388 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.100365 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.102943 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.102922 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 04:50:50.114371 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.114344 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p"] Apr 21 04:50:50.208075 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.208043 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.208242 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.208094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.208242 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.208174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.208358 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.208239 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.208358 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.208264 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hs78\" (UniqueName: \"kubernetes.io/projected/c96b272e-3ec7-4298-8697-437f5a13f5a0-kube-api-access-9hs78\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.208358 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.208315 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c96b272e-3ec7-4298-8697-437f5a13f5a0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.309618 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.309576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.309810 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.309630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: 
\"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.309810 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.309682 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.309810 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.309702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hs78\" (UniqueName: \"kubernetes.io/projected/c96b272e-3ec7-4298-8697-437f5a13f5a0-kube-api-access-9hs78\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.309810 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.309752 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c96b272e-3ec7-4298-8697-437f5a13f5a0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.309810 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.309799 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.310067 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.310044 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.310067 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.310059 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.310171 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.310145 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.312478 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.312432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c96b272e-3ec7-4298-8697-437f5a13f5a0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.312839 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.312816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c96b272e-3ec7-4298-8697-437f5a13f5a0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.319424 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.319396 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hs78\" (UniqueName: \"kubernetes.io/projected/c96b272e-3ec7-4298-8697-437f5a13f5a0-kube-api-access-9hs78\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p\" (UID: \"c96b272e-3ec7-4298-8697-437f5a13f5a0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.414161 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.414140 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:50.545584 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:50.545559 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p"] Apr 21 04:50:50.548340 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:50:50.548312 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc96b272e_3ec7_4298_8697_437f5a13f5a0.slice/crio-bafec0993e7bb227226c43bc4ca8212026cca7803997cdb57599ac64501254c9 WatchSource:0}: Error finding container bafec0993e7bb227226c43bc4ca8212026cca7803997cdb57599ac64501254c9: Status 404 returned error can't find the container with id bafec0993e7bb227226c43bc4ca8212026cca7803997cdb57599ac64501254c9 Apr 21 04:50:51.329653 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:51.329625 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:50:51.357682 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:51.357628 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" 
event={"ID":"c96b272e-3ec7-4298-8697-437f5a13f5a0","Type":"ContainerStarted","Data":"ba0731ea4b4e560fb91637d7083366b93f24c14b858ae94965c8a5ed2a928bac"} Apr 21 04:50:51.357898 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:51.357878 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" event={"ID":"c96b272e-3ec7-4298-8697-437f5a13f5a0","Type":"ContainerStarted","Data":"bafec0993e7bb227226c43bc4ca8212026cca7803997cdb57599ac64501254c9"} Apr 21 04:50:51.359629 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:51.359600 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" event={"ID":"d09563ea-7080-4fc7-abd1-78d54ef48e21","Type":"ContainerStarted","Data":"348d18e9376d7fc78e19f4b8466f9ce83c3f170b993d37fa6ea811879a43fec8"} Apr 21 04:50:51.359829 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:51.359810 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:50:51.405015 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:51.404971 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" podStartSLOduration=1.935458252 podStartE2EDuration="13.404955763s" podCreationTimestamp="2026-04-21 04:50:38 +0000 UTC" firstStartedPulling="2026-04-21 04:50:38.937169401 +0000 UTC m=+725.331531123" lastFinishedPulling="2026-04-21 04:50:50.406666902 +0000 UTC m=+736.801028634" observedRunningTime="2026-04-21 04:50:51.403694766 +0000 UTC m=+737.798056517" watchObservedRunningTime="2026-04-21 04:50:51.404955763 +0000 UTC m=+737.799317502" Apr 21 04:50:56.376203 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:56.376169 2570 generic.go:358] "Generic (PLEG): container finished" podID="c96b272e-3ec7-4298-8697-437f5a13f5a0" 
containerID="ba0731ea4b4e560fb91637d7083366b93f24c14b858ae94965c8a5ed2a928bac" exitCode=0 Apr 21 04:50:56.376572 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:56.376248 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" event={"ID":"c96b272e-3ec7-4298-8697-437f5a13f5a0","Type":"ContainerDied","Data":"ba0731ea4b4e560fb91637d7083366b93f24c14b858ae94965c8a5ed2a928bac"} Apr 21 04:50:56.717961 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:56.717880 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:50:57.380513 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:57.380468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" event={"ID":"c96b272e-3ec7-4298-8697-437f5a13f5a0","Type":"ContainerStarted","Data":"dfed47d85b604baf7ad5ca29fa08163d049f4a633040630abba99f61aa3c129f"} Apr 21 04:50:57.380889 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:57.380701 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:50:57.401407 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:50:57.401359 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" podStartSLOduration=7.212105324 podStartE2EDuration="7.401343184s" podCreationTimestamp="2026-04-21 04:50:50 +0000 UTC" firstStartedPulling="2026-04-21 04:50:56.376913753 +0000 UTC m=+742.771275471" lastFinishedPulling="2026-04-21 04:50:56.56615161 +0000 UTC m=+742.960513331" observedRunningTime="2026-04-21 04:50:57.400089983 +0000 UTC m=+743.794451724" watchObservedRunningTime="2026-04-21 04:50:57.401343184 +0000 UTC m=+743.795704925" Apr 21 04:51:02.376003 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:02.375974 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj" Apr 21 04:51:08.396287 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:08.396254 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p" Apr 21 04:51:29.494544 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.494489 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr"] Apr 21 04:51:29.498066 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.498046 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.500458 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.500436 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 04:51:29.509346 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.509321 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr"] Apr 21 04:51:29.620715 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.620682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.620715 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.620716 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.620919 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.620742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.620919 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.620844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2dq\" (UniqueName: \"kubernetes.io/projected/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-kube-api-access-hk2dq\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.620919 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.620892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.621016 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.620934 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721328 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721473 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721546 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721461 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721546 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721636 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" 
Apr 21 04:51:29.721636 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2dq\" (UniqueName: \"kubernetes.io/projected/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-kube-api-access-hk2dq\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721734 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721655 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721879 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721861 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.721941 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.721913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.723530 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.723487 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-dshm\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.723768 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.723750 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.729448 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.729419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2dq\" (UniqueName: \"kubernetes.io/projected/0cda8831-61cd-4003-9e30-5f2b1fd8dbe2-kube-api-access-hk2dq\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr\" (UID: \"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.808449 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.808416 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:29.933445 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:29.933417 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr"] Apr 21 04:51:29.935474 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:51:29.935444 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cda8831_61cd_4003_9e30_5f2b1fd8dbe2.slice/crio-e8a1a756919e6c270a7ce754b1caf7fbe4ab56ce21187d23eae1a3fda9255c12 WatchSource:0}: Error finding container e8a1a756919e6c270a7ce754b1caf7fbe4ab56ce21187d23eae1a3fda9255c12: Status 404 returned error can't find the container with id e8a1a756919e6c270a7ce754b1caf7fbe4ab56ce21187d23eae1a3fda9255c12 Apr 21 04:51:30.490653 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:30.490604 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" event={"ID":"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2","Type":"ContainerStarted","Data":"a65e3fe6136ac75774d04735d6a8f26a69e09ef8991296b666267377e00fcbc2"} Apr 21 04:51:30.490653 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:30.490648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" event={"ID":"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2","Type":"ContainerStarted","Data":"e8a1a756919e6c270a7ce754b1caf7fbe4ab56ce21187d23eae1a3fda9255c12"} Apr 21 04:51:31.025558 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:31.025518 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-dtvwz"] Apr 21 04:51:35.506976 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:35.506890 2570 generic.go:358] "Generic (PLEG): container finished" podID="0cda8831-61cd-4003-9e30-5f2b1fd8dbe2" containerID="a65e3fe6136ac75774d04735d6a8f26a69e09ef8991296b666267377e00fcbc2" exitCode=0 Apr 21 
04:51:35.506976 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:35.506965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" event={"ID":"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2","Type":"ContainerDied","Data":"a65e3fe6136ac75774d04735d6a8f26a69e09ef8991296b666267377e00fcbc2"} Apr 21 04:51:36.511793 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:36.511759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" event={"ID":"0cda8831-61cd-4003-9e30-5f2b1fd8dbe2","Type":"ContainerStarted","Data":"a05d0b960c0ccf378474bb9085cc00982229bb6b9f95c3308e73d9817ff8d1ff"} Apr 21 04:51:36.512198 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:36.511991 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:51:36.532934 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:36.532882 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" podStartSLOduration=7.359106219 podStartE2EDuration="7.532870945s" podCreationTimestamp="2026-04-21 04:51:29 +0000 UTC" firstStartedPulling="2026-04-21 04:51:35.507588961 +0000 UTC m=+781.901950678" lastFinishedPulling="2026-04-21 04:51:35.681353685 +0000 UTC m=+782.075715404" observedRunningTime="2026-04-21 04:51:36.531577028 +0000 UTC m=+782.925938767" watchObservedRunningTime="2026-04-21 04:51:36.532870945 +0000 UTC m=+782.927232740" Apr 21 04:51:47.528209 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:51:47.528180 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr" Apr 21 04:52:33.881411 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:33.881381 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-55ddb68486-ws65x_ff22f569-478b-4a9f-b91c-f437a2794ab9/manager/0.log" Apr 21 04:52:35.572880 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:35.572852 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-bfnvm_5b3dc471-0d48-47e9-a29a-d7e0617bc84d/kuadrant-console-plugin/0.log" Apr 21 04:52:35.682755 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:35.682721 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-6zvhn_8d587dde-69cc-4333-8982-70e6e4ceb562/registry-server/0.log" Apr 21 04:52:35.909381 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:35.909313 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-dtvwz_30b776f1-fcaa-4447-9429-5d4c4a68c5c6/limitador/0.log" Apr 21 04:52:36.019881 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:36.019856 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-9tdqx_7abebefe-81ff-403d-b854-b622a7aa705e/manager/0.log" Apr 21 04:52:36.368741 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:36.368706 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f7gchz_837f93dc-e520-4850-a838-f868fc265b37/istio-proxy/0.log" Apr 21 04:52:36.797778 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:36.797749 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-xrgv9_5e853989-b263-48f1-ae11-7871beb8ab55/istio-proxy/0.log" Apr 21 04:52:37.257593 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:37.257522 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p_c96b272e-3ec7-4298-8697-437f5a13f5a0/storage-initializer/0.log"
Apr 21 04:52:37.265396 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:37.265365 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-kp25p_c96b272e-3ec7-4298-8697-437f5a13f5a0/main/0.log"
Apr 21 04:52:37.381351 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:37.381322 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr_0cda8831-61cd-4003-9e30-5f2b1fd8dbe2/storage-initializer/0.log"
Apr 21 04:52:37.387854 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:37.387833 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-vvwfr_0cda8831-61cd-4003-9e30-5f2b1fd8dbe2/main/0.log"
Apr 21 04:52:37.615021 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:37.614993 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj_d09563ea-7080-4fc7-abd1-78d54ef48e21/storage-initializer/0.log"
Apr 21 04:52:37.622905 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:37.622880 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccjhhrj_d09563ea-7080-4fc7-abd1-78d54ef48e21/main/0.log"
Apr 21 04:52:41.505623 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.505586 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffw4/must-gather-7mbpf"]
Apr 21 04:52:41.508912 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.508890 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.511664 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.511640 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tffw4\"/\"default-dockercfg-49xnb\""
Apr 21 04:52:41.511786 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.511764 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tffw4\"/\"openshift-service-ca.crt\""
Apr 21 04:52:41.511861 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.511803 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tffw4\"/\"kube-root-ca.crt\""
Apr 21 04:52:41.519271 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.519251 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/must-gather-7mbpf"]
Apr 21 04:52:41.579896 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.579872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b7279e0-1b91-43ea-8c73-5a4f57eb8470-must-gather-output\") pod \"must-gather-7mbpf\" (UID: \"2b7279e0-1b91-43ea-8c73-5a4f57eb8470\") " pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.580010 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.579948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rljf\" (UniqueName: \"kubernetes.io/projected/2b7279e0-1b91-43ea-8c73-5a4f57eb8470-kube-api-access-6rljf\") pod \"must-gather-7mbpf\" (UID: \"2b7279e0-1b91-43ea-8c73-5a4f57eb8470\") " pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.680411 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.680369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rljf\" (UniqueName: \"kubernetes.io/projected/2b7279e0-1b91-43ea-8c73-5a4f57eb8470-kube-api-access-6rljf\") pod \"must-gather-7mbpf\" (UID: \"2b7279e0-1b91-43ea-8c73-5a4f57eb8470\") " pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.680610 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.680446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b7279e0-1b91-43ea-8c73-5a4f57eb8470-must-gather-output\") pod \"must-gather-7mbpf\" (UID: \"2b7279e0-1b91-43ea-8c73-5a4f57eb8470\") " pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.680828 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.680807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b7279e0-1b91-43ea-8c73-5a4f57eb8470-must-gather-output\") pod \"must-gather-7mbpf\" (UID: \"2b7279e0-1b91-43ea-8c73-5a4f57eb8470\") " pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.690696 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.690670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rljf\" (UniqueName: \"kubernetes.io/projected/2b7279e0-1b91-43ea-8c73-5a4f57eb8470-kube-api-access-6rljf\") pod \"must-gather-7mbpf\" (UID: \"2b7279e0-1b91-43ea-8c73-5a4f57eb8470\") " pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.818772 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.818731 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffw4/must-gather-7mbpf"
Apr 21 04:52:41.955841 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:41.955816 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/must-gather-7mbpf"]
Apr 21 04:52:41.956963 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:52:41.956937 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7279e0_1b91_43ea_8c73_5a4f57eb8470.slice/crio-27a7636b0fc9702f05c1b202bffa9e705be20b3de6102e33fed9d8056f549ff4 WatchSource:0}: Error finding container 27a7636b0fc9702f05c1b202bffa9e705be20b3de6102e33fed9d8056f549ff4: Status 404 returned error can't find the container with id 27a7636b0fc9702f05c1b202bffa9e705be20b3de6102e33fed9d8056f549ff4
Apr 21 04:52:42.721297 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:42.721263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/must-gather-7mbpf" event={"ID":"2b7279e0-1b91-43ea-8c73-5a4f57eb8470","Type":"ContainerStarted","Data":"27a7636b0fc9702f05c1b202bffa9e705be20b3de6102e33fed9d8056f549ff4"}
Apr 21 04:52:43.726517 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:43.726465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/must-gather-7mbpf" event={"ID":"2b7279e0-1b91-43ea-8c73-5a4f57eb8470","Type":"ContainerStarted","Data":"b7b4cb9817a0157e5d0c1adb5cf34b582bfe6932c0d2899dc595b66ea2909ab5"}
Apr 21 04:52:43.726933 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:43.726535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/must-gather-7mbpf" event={"ID":"2b7279e0-1b91-43ea-8c73-5a4f57eb8470","Type":"ContainerStarted","Data":"13248c22a00bf0623403c130bf783ea439fa912ee8325642ec8249da0e89a06d"}
Apr 21 04:52:43.745891 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:43.745828 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tffw4/must-gather-7mbpf" podStartSLOduration=1.920674969 podStartE2EDuration="2.745808624s" podCreationTimestamp="2026-04-21 04:52:41 +0000 UTC" firstStartedPulling="2026-04-21 04:52:41.958817002 +0000 UTC m=+848.353178720" lastFinishedPulling="2026-04-21 04:52:42.783950654 +0000 UTC m=+849.178312375" observedRunningTime="2026-04-21 04:52:43.743297565 +0000 UTC m=+850.137659297" watchObservedRunningTime="2026-04-21 04:52:43.745808624 +0000 UTC m=+850.140170364"
Apr 21 04:52:44.392303 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:44.392269 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2wxxz_a8a94be7-29d2-46f0-af2a-5a46e5fe8810/global-pull-secret-syncer/0.log"
Apr 21 04:52:44.588475 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:44.588444 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dw9qb_6282aefe-100f-4587-93df-5faf16b1e100/konnectivity-agent/0.log"
Apr 21 04:52:44.645620 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:44.645544 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-11.ec2.internal_b067c9cf5db8c3de32f82b49fa084d46/haproxy/0.log"
Apr 21 04:52:49.035563 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:49.035532 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-bfnvm_5b3dc471-0d48-47e9-a29a-d7e0617bc84d/kuadrant-console-plugin/0.log"
Apr 21 04:52:49.066036 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:49.066006 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-6zvhn_8d587dde-69cc-4333-8982-70e6e4ceb562/registry-server/0.log"
Apr 21 04:52:49.159350 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:49.159319 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-dtvwz_30b776f1-fcaa-4447-9429-5d4c4a68c5c6/limitador/0.log"
Apr 21 04:52:49.194459 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:49.194425 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-9tdqx_7abebefe-81ff-403d-b854-b622a7aa705e/manager/0.log"
Apr 21 04:52:51.055928 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:51.055885 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hxlxc_24a1ca69-0786-4476-865b-922206b6c523/node-exporter/0.log"
Apr 21 04:52:51.077970 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:51.077942 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hxlxc_24a1ca69-0786-4476-865b-922206b6c523/kube-rbac-proxy/0.log"
Apr 21 04:52:51.100613 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:51.100585 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hxlxc_24a1ca69-0786-4476-865b-922206b6c523/init-textfile/0.log"
Apr 21 04:52:52.876466 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.876428 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"]
Apr 21 04:52:52.881624 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.881597 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:52.890555 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.890525 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"]
Apr 21 04:52:52.992153 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.992120 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-sys\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:52.992357 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.992172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-lib-modules\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:52.992357 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.992273 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-proc\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:52.992357 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.992298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd944\" (UniqueName: \"kubernetes.io/projected/af8c0af7-5d32-41ac-805d-e89cca22a23c-kube-api-access-wd944\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:52.992508 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:52.992389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-podres\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.093792 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.093750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-proc\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.094042 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094011 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-proc\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.094164 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd944\" (UniqueName: \"kubernetes.io/projected/af8c0af7-5d32-41ac-805d-e89cca22a23c-kube-api-access-wd944\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.094338 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-podres\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.094644 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-sys\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.094881 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-lib-modules\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.095149 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094571 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-podres\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.095272 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.095066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-lib-modules\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.095369 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.094801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af8c0af7-5d32-41ac-805d-e89cca22a23c-sys\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.102826 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.102776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd944\" (UniqueName: \"kubernetes.io/projected/af8c0af7-5d32-41ac-805d-e89cca22a23c-kube-api-access-wd944\") pod \"perf-node-gather-daemonset-rzcpk\" (UID: \"af8c0af7-5d32-41ac-805d-e89cca22a23c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.195741 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.195667 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.361104 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.361076 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"]
Apr 21 04:52:53.366676 ip-10-0-140-11 kubenswrapper[2570]: W0421 04:52:53.366643 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf8c0af7_5d32_41ac_805d_e89cca22a23c.slice/crio-f70d5032ed526e649f68d6ba1cd0c6c4c1b1a326722ac3d547e558d9d2fc6e37 WatchSource:0}: Error finding container f70d5032ed526e649f68d6ba1cd0c6c4c1b1a326722ac3d547e558d9d2fc6e37: Status 404 returned error can't find the container with id f70d5032ed526e649f68d6ba1cd0c6c4c1b1a326722ac3d547e558d9d2fc6e37
Apr 21 04:52:53.771752 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.771657 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk" event={"ID":"af8c0af7-5d32-41ac-805d-e89cca22a23c","Type":"ContainerStarted","Data":"75cd1e8695da61075db871acdbb14ed17975171b1e177a5649416fbc60d6aa2a"}
Apr 21 04:52:53.771752 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.771697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk" event={"ID":"af8c0af7-5d32-41ac-805d-e89cca22a23c","Type":"ContainerStarted","Data":"f70d5032ed526e649f68d6ba1cd0c6c4c1b1a326722ac3d547e558d9d2fc6e37"}
Apr 21 04:52:53.771752 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.771742 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:52:53.787390 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:53.787336 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk" podStartSLOduration=1.787319109 podStartE2EDuration="1.787319109s" podCreationTimestamp="2026-04-21 04:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:52:53.786590846 +0000 UTC m=+860.180952605" watchObservedRunningTime="2026-04-21 04:52:53.787319109 +0000 UTC m=+860.181680848"
Apr 21 04:52:55.373758 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:55.373728 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m4stv_3d7b4054-d280-4074-b713-a7fe58a0ee82/dns/0.log"
Apr 21 04:52:55.395747 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:55.395709 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m4stv_3d7b4054-d280-4074-b713-a7fe58a0ee82/kube-rbac-proxy/0.log"
Apr 21 04:52:55.550463 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:55.550438 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m6sh5_c70f167b-0eff-4017-9272-7a887e981112/dns-node-resolver/0.log"
Apr 21 04:52:56.075056 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:56.075015 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7cpf7_c6ff4930-586a-401d-8bf7-787218f408d0/node-ca/0.log"
Apr 21 04:52:56.926137 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:56.926106 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f7gchz_837f93dc-e520-4850-a838-f868fc265b37/istio-proxy/0.log"
Apr 21 04:52:57.068957 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:57.068924 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-xrgv9_5e853989-b263-48f1-ae11-7871beb8ab55/istio-proxy/0.log"
Apr 21 04:52:57.824343 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:57.824316 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xwsgf_b0ff8417-568b-49f9-adc4-be1ff4ba8ca5/serve-healthcheck-canary/0.log"
Apr 21 04:52:58.297392 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:58.297369 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqh6m_e2f06537-61ad-4db9-9f8b-8b588c0e0f9e/kube-rbac-proxy/0.log"
Apr 21 04:52:58.322677 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:58.322656 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqh6m_e2f06537-61ad-4db9-9f8b-8b588c0e0f9e/exporter/0.log"
Apr 21 04:52:58.345054 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:58.345032 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cqh6m_e2f06537-61ad-4db9-9f8b-8b588c0e0f9e/extractor/0.log"
Apr 21 04:52:59.784368 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:52:59.784340 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-rzcpk"
Apr 21 04:53:00.651265 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:00.651239 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-55ddb68486-ws65x_ff22f569-478b-4a9f-b91c-f437a2794ab9/manager/0.log"
Apr 21 04:53:02.104877 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:02.104847 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-bb95b586d-j9bn2_2b951746-6366-4fed-90d8-dc82961114b2/manager/0.log"
Apr 21 04:53:08.129356 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.129324 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/kube-multus-additional-cni-plugins/0.log"
Apr 21 04:53:08.152408 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.152377 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/egress-router-binary-copy/0.log"
Apr 21 04:53:08.176219 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.176195 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/cni-plugins/0.log"
Apr 21 04:53:08.203279 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.203257 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/bond-cni-plugin/0.log"
Apr 21 04:53:08.228702 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.228677 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/routeoverride-cni/0.log"
Apr 21 04:53:08.252960 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.252935 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/whereabouts-cni-bincopy/0.log"
Apr 21 04:53:08.274851 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.274829 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8dxf_a2b122c8-53b3-4280-9f62-b777ac256ac3/whereabouts-cni/0.log"
Apr 21 04:53:08.491186 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.491109 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jqrlr_3d76811b-93de-4955-b346-ce731491aa8c/kube-multus/0.log"
Apr 21 04:53:08.551916 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.551889 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c478k_1500cffd-5994-4d2a-bd36-855f9cf3efe5/network-metrics-daemon/0.log"
Apr 21 04:53:08.583330 ip-10-0-140-11 kubenswrapper[2570]: I0421 04:53:08.583305 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c478k_1500cffd-5994-4d2a-bd36-855f9cf3efe5/kube-rbac-proxy/0.log"