Apr 17 17:04:43.066078 ip-10-0-134-244 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:04:43.066087 ip-10-0-134-244 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:04:43.066094 ip-10-0-134-244 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:04:43.066326 ip-10-0-134-244 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:04:53.181150 ip-10-0-134-244 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:04:53.181164 ip-10-0-134-244 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot dbde667a2a454e739c0be8287030001f --
Apr 17 17:07:26.283596 ip-10-0-134-244 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:07:26.669332 ip-10-0-134-244 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:07:26.669332 ip-10-0-134-244 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:07:26.669332 ip-10-0-134-244 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:07:26.669332 ip-10-0-134-244 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
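The deprecation warnings above all point at the KubeletConfiguration file named by `--config` (later in this log the FLAG dump shows `--config="/etc/kubernetes/kubelet.conf"`). A minimal sketch of the equivalent config-file settings follows; the field names are real KubeletConfiguration v1beta1 fields, but the values here are illustrative, not read from this node:

```yaml
# Sketch: KubeletConfiguration fields replacing the deprecated flags.
# Values are illustrative examples, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # replaces --minimum-container-ttl-duration per the warning
  memory.available: 100Mi
```

`--pod-infra-container-image` has no config-file equivalent; per the warning, the sandbox image is taken from the CRI runtime's own configuration.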
Apr 17 17:07:26.669332 ip-10-0-134-244 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:07:26.671951 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.671869 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:07:26.676490 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676468 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:26.676490 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676487 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:26.676490 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676493 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:26.676490 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676496 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676500 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676503 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676506 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676509 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676512 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676516 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676518 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676521 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676524 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676527 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676530 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676533 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676535 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676538 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676541 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676543 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676546 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676548 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676551 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:26.676656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676554 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676556 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676559 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676563 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676567 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676570 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676573 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676576 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676579 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676589 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676593 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676595 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676598 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676600 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676604 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676607 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676610 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676613 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676615 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676619 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:26.677129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676622 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676625 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676627 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676630 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676633 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676636 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676638 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676641 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676643 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676646 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676648 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676651 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676654 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676656 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676659 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676661 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676665 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676667 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676670 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676673 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:26.677642 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676676 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676678 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676680 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676683 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676686 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676689 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676693 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676696 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676699 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676701 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676704 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676707 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676709 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676712 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676715 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676718 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676721 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676723 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676726 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676728 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:26.678128 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676731 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676734 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.676736 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677113 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677118 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677121 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677124 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677126 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677129 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677132 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677136 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
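The `unrecognized feature gate` warnings above are OpenShift-specific gate names being passed to the upstream kubelet's gate parser, which logs and ignores them; the set is then logged a second time with later timestamps. Rather than scrolling through the wall of warnings, the gate names can be collapsed into a unique sorted list. A sketch, demonstrated here on two sample journal lines (on the node you would pipe `journalctl -u kubelet` in instead):

```shell
# Extract and deduplicate the gate names from "unrecognized feature gate"
# warnings. The printf stands in for `journalctl -u kubelet` output.
printf '%s\n' \
  'W0417 17:07:26.676468 2568 feature_gate.go:328] unrecognized feature gate: Example' \
  'W0417 17:07:26.677248 2568 feature_gate.go:328] unrecognized feature gate: Example' \
  | grep -o 'unrecognized feature gate: [A-Za-z0-9]*' \
  | sed 's/unrecognized feature gate: //' \
  | sort -u
# prints: Example
```
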
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677140 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677142 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677146 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677149 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677152 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677154 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677157 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677160 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:26.678621 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677162 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677165 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677168 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677170 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677173 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677175 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677178 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677180 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677183 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677186 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677188 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677191 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677193 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677196 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677198 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677200 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677204 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677206 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677209 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677212 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:26.679081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677214 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677217 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677220 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677222 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677225 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677227 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677230 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677232 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677235 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677237 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677240 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677242 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677245 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677248 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677250 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677253 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677256 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677259 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677261 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677264 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:26.679593 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677266 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677269 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677272 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677274 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677277 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677279 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677283 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677287 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677290 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677292 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677295 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677297 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677300 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677320 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677324 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677328 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677331 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677334 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677337 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677339 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:26.680081 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677342 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677344 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677349 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677352 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677355 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677357 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677360 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677362 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677365 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.677367 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678419 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678430 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678437 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678442 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678447 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678451 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678455 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678463 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678467 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678470 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678474 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678477 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:07:26.680600 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678480 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678483 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678486 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678489 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678495 2568 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678498 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678501 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678506 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678508 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678512 2568 flags.go:64] FLAG: --config-dir=""
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678515 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678518 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678522 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678525 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678528 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678532 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678535 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678538 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678541 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678544 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678547 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678551 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678554 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678557 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678560 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:07:26.681131 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678563 2568 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678566 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678571 2568 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678574 2568 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678577 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678580 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678583 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678587 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678590 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678594 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678597 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678602 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678605 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678608 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678611 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678614 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678616 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:07:26.681805 ip-10-0-134-244
kubenswrapper[2568]: I0417 17:07:26.678619 2568 flags.go:64] FLAG: --feature-gates="" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678623 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678627 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678630 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678633 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678636 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678639 2568 flags.go:64] FLAG: --help="false" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678642 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-134-244.ec2.internal" Apr 17 17:07:26.681805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678646 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678649 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678652 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678656 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678659 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678662 2568 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678665 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678668 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678671 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678674 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678678 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678681 2568 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678684 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678687 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678690 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678693 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678696 2568 flags.go:64] FLAG: --lock-file="" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678698 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678703 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678706 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: 
I0417 17:07:26.678711 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678714 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678717 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678720 2568 flags.go:64] FLAG: --logging-format="text" Apr 17 17:07:26.682417 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678723 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678726 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678729 2568 flags.go:64] FLAG: --manifest-url="" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678732 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678737 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678740 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678743 2568 flags.go:64] FLAG: --max-pods="110" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678746 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678749 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678752 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678756 2568 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678759 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678762 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678765 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678772 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678775 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678778 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678782 2568 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678785 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678790 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678793 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678798 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678801 2568 flags.go:64] FLAG: --port="10250" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678804 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:07:26.682997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678807 
2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a6080f48c46eb965" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678810 2568 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678815 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678818 2568 flags.go:64] FLAG: --register-node="true" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678821 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678824 2568 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678828 2568 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678830 2568 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678833 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678836 2568 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678840 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678843 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678846 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678849 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678852 2568 flags.go:64] FLAG: --runonce="false" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678855 2568 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678858 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678861 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678864 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678867 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678870 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678873 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678876 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678879 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678882 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678884 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:07:26.683639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678888 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678891 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678893 2568 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678898 2568 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678903 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678906 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678909 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678913 2568 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678917 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678920 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678923 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678926 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678929 2568 flags.go:64] FLAG: --v="2" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678933 2568 flags.go:64] FLAG: --version="false" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678937 2568 flags.go:64] FLAG: --vmodule="" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678941 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.678944 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679038 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:07:26.684255 ip-10-0-134-244 
kubenswrapper[2568]: W0417 17:07:26.679042 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679045 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679048 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679050 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679053 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679055 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:07:26.684255 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679058 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679060 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679063 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679065 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679068 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679071 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679073 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk 
Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679076 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679079 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679082 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679085 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679089 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679092 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679095 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679097 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679100 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679104 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679107 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679109 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 
17:07:26.679112 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:07:26.684867 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679114 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679117 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679120 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679123 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679125 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679128 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679131 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679133 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679136 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679138 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679141 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679143 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:07:26.685421 
ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679146 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679148 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679151 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679158 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679160 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679163 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679166 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679168 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:07:26.685421 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679170 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679173 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679176 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679179 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679182 2568 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679184 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679187 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679190 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679194 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679196 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679199 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679201 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679204 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679207 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679209 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679212 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679215 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 
17:07:26.679218 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679220 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679223 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:07:26.685923 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679225 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679229 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679233 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679235 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679238 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679240 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679243 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679246 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679248 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679251 2568 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImages Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679253 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679256 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679259 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679261 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679264 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679268 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679272 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679275 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:26.686436 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.679278 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.679287 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.686054 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.686069 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686117 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686121 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686125 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686129 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686132 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686135 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686139 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686143 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686147 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686150 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686152 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:26.686914 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686155 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686157 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686160 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686163 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686165 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686168 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686170 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686173 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686176 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686178 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686181 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686183 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686186 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686188 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686192 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686194 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686197 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686199 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686202 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686204 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:26.687283 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686208 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686211 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686213 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686216 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686219 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686221 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686224 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686226 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686229 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686232 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686234 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686236 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686239 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686241 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686244 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686246 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686249 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686251 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686254 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686257 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:26.687841 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686259 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686262 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686264 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686267 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686269 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686272 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686274 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686277 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686280 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686282 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686285 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686288 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686291 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686293 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686296 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686299 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686314 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686316 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686319 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686322 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:26.688344 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686325 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686327 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686330 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686332 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686335 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686338 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686341 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686343 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686347 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686350 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686353 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686355 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686358 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686360 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686363 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.686368 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:26.688819 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686462 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686466 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686468 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686471 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686474 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686477 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686479 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686481 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686485 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686488 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686491 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686493 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686496 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686498 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686501 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686503 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686506 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686508 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686511 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686514 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:26.689216 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686516 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686518 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686521 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686523 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686526 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686528 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686531 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686533 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686536 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686538 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686541 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686543 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686546 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686548 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686551 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686554 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686556 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686561 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686564 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:26.689709 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686567 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686571 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686574 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686577 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686580 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686583 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686586 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686588 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686591 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686593 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686595 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686598 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686601 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686603 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686605 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686608 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686611 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686613 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686616 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686618 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:26.690684 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686620 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686623 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686625 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686628 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686630 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686633 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686636 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686638 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686641 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686643 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686646 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686648 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686651 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686654 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686657 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686659 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686662 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686664 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686667 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686670 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:26.691185 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686672 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686674 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686677 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686679 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686682 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686684 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:26.686688 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.686693 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.687360 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:07:26.691724 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.691020 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:07:26.692013 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.691808 2568 server.go:1019] "Starting client certificate rotation"
Apr 17 17:07:26.692013 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.691921 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:07:26.692013 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.691976 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:07:26.716544 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.716523 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:07:26.718762 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.718740 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:07:26.733322 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.733295 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:07:26.738776 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.738755 2568 log.go:25] "Validated CRI v1 image API"
Apr 17 17:07:26.739927 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.739911 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:07:26.744045 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.744026 2568 fs.go:135] Filesystem UUIDs: map[5ad543c2-7517-4b27-ac82-f5d0a19c737c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f78a4c0f-3e9d-4c45-a97d-cf484479a815:/dev/nvme0n1p4]
Apr 17 17:07:26.744110 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.744044 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:07:26.746793 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.746775 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:07:26.749551 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.749437 2568 manager.go:217] Machine: {Timestamp:2026-04-17 17:07:26.747682088 +0000 UTC m=+0.363815342 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099184 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2de5121f7f3000adec5043b63ca2d3 SystemUUID:ec2de512-1f7f-3000-adec-5043b63ca2d3 BootID:dbde667a-2a45-4e73-9c0b-e8287030001f Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b7:d4:d2:1b:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b7:d4:d2:1b:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:0f:35:be:bb:47 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:07:26.749551 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.749543 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:07:26.749684 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.749624 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:07:26.750584 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.750558 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:07:26.750733 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.750586 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-244.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinR
eclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:07:26.750779 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.750743 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:07:26.750779 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.750752 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:07:26.750779 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.750765 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:07:26.751333 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.751323 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:07:26.752642 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.752632 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:07:26.752745 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.752736 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:07:26.755769 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.755758 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:07:26.755815 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.755772 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:07:26.755815 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.755786 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 
17:07:26.755815 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.755795 2568 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:07:26.755815 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.755812 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 17:07:26.756808 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.756796 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:07:26.756851 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.756815 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:07:26.760776 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.760752 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:07:26.762337 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.762323 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:07:26.763961 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.763942 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.763964 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.763971 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.763977 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.763983 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: 
I0417 17:07:26.763990 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.763996 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764005 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764014 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764021 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:07:26.764027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764030 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:07:26.764285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764039 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:07:26.764907 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764896 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:07:26.764940 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.764908 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:07:26.767893 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.767876 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-244.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:07:26.767971 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.767950 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:07:26.768023 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.768004 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:07:26.768661 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.768650 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:07:26.768691 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.768684 2568 server.go:1295] "Started kubelet" Apr 17 17:07:26.768795 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.768767 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:07:26.768904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.768817 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:07:26.768904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.768879 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:07:26.769469 ip-10-0-134-244 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:07:26.770029 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.769907 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:07:26.770960 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.770946 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:07:26.775602 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.775577 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:07:26.775997 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.775980 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:07:26.776651 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.776623 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:07:26.776731 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.776663 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:07:26.776731 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.776693 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:26.776836 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.776765 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:07:26.776836 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.776773 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:07:26.776934 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.776626 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:07:26.776984 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.776967 2568 factory.go:55] Registering systemd factory Apr 17 17:07:26.777033 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.777022 2568 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:07:26.778746 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:07:26.778361 2568 factory.go:153] Registering CRI-O factory Apr 17 17:07:26.778746 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.778377 2568 factory.go:223] Registration of the crio container factory successfully Apr 17 17:07:26.778746 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.778425 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:07:26.778746 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.778448 2568 factory.go:103] Registering Raw factory Apr 17 17:07:26.778746 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.778461 2568 manager.go:1196] Started watching for new ooms in manager Apr 17 17:07:26.779014 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.778852 2568 manager.go:319] Starting recovery of all containers Apr 17 17:07:26.781320 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.781275 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:07:26.783558 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.783537 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:07:26.783633 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.783582 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:07:26.784635 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.783665 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-244.ec2.internal.18a733e75a6f64e2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-244.ec2.internal,UID:ip-10-0-134-244.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-244.ec2.internal,},FirstTimestamp:2026-04-17 17:07:26.76866173 +0000 UTC m=+0.384794983,LastTimestamp:2026-04-17 17:07:26.76866173 +0000 UTC m=+0.384794983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-244.ec2.internal,}" Apr 17 17:07:26.788794 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.788768 2568 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2cj56" Apr 17 17:07:26.791680 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.791530 2568 manager.go:324] Recovery completed Apr 17 17:07:26.796273 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.796260 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:26.798658 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.798639 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2cj56" Apr 17 17:07:26.798658 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.798652 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:26.798799 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.798684 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:26.798799 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.798694 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:26.799183 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.799168 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:07:26.799183 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.799181 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:07:26.799267 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.799200 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:07:26.800534 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.800476 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-244.ec2.internal.18a733e75c39362e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-244.ec2.internal,UID:ip-10-0-134-244.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-244.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-244.ec2.internal,},FirstTimestamp:2026-04-17 17:07:26.798665262 +0000 UTC m=+0.414798516,LastTimestamp:2026-04-17 17:07:26.798665262 +0000 UTC m=+0.414798516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-244.ec2.internal,}" Apr 17 17:07:26.801932 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.801920 2568 policy_none.go:49] "None policy: Start" Apr 17 17:07:26.801990 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.801936 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:07:26.801990 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.801945 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:07:26.839078 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.838882 2568 manager.go:341] "Starting Device Plugin manager" Apr 17 17:07:26.839078 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.838907 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:07:26.839078 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.838917 2568 server.go:85] "Starting device plugin registration server" Apr 17 17:07:26.839262 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.839131 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:07:26.839262 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.839144 2568 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:07:26.839262 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.839249 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:07:26.839425 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.839357 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:07:26.839425 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.839367 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:07:26.839807 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.839786 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:07:26.839871 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.839832 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:26.899220 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.899178 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:07:26.900353 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.900337 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:07:26.900451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.900358 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:07:26.900451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.900375 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:07:26.900451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.900381 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:07:26.900451 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.900417 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:07:26.903508 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.903484 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:26.939411 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.939359 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:26.940135 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.940120 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:26.940184 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.940150 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:26.940184 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.940161 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:26.940184 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.940182 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-244.ec2.internal" Apr 17 17:07:26.948620 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:26.948601 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-244.ec2.internal" Apr 17 17:07:26.948683 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.948629 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-244.ec2.internal\": node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 
17:07:26.971639 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:26.971615 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.001367 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.001346 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"] Apr 17 17:07:27.001446 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.001413 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:27.002132 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.002119 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:27.002198 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.002143 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:27.002198 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.002154 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:27.003460 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.003447 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:27.003634 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.003620 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.003694 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.003661 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:27.004486 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.004467 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:27.004553 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.004488 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:27.004553 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.004514 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:27.004553 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.004523 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:27.004648 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.004493 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:27.004678 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.004660 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:27.006015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.006002 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.006098 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.006029 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:27.006734 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.006721 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:27.006832 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.006749 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:27.006832 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.006764 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:27.030895 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.030878 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-244.ec2.internal\" not found" node="ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.035279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.035264 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-244.ec2.internal\" not found" node="ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.071836 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.071818 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.078136 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.078115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.078197 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.078141 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.078197 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.078158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/009f099669c612b1a9a7e8809b1d3526-config\") pod \"kube-apiserver-proxy-ip-10-0-134-244.ec2.internal\" (UID: \"009f099669c612b1a9a7e8809b1d3526\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.172461 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.172430 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.178789 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.178771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.178843 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.178795 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/009f099669c612b1a9a7e8809b1d3526-config\") pod \"kube-apiserver-proxy-ip-10-0-134-244.ec2.internal\" (UID: \"009f099669c612b1a9a7e8809b1d3526\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.178843 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.178813 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.178904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.178856 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.178904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.178886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/009f099669c612b1a9a7e8809b1d3526-config\") pod \"kube-apiserver-proxy-ip-10-0-134-244.ec2.internal\" (UID: \"009f099669c612b1a9a7e8809b1d3526\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.178904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.178858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.273181 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.273159 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.332604 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.332578 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.338017 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.338002 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 17 17:07:27.374157 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.374139 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.474653 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.474629 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.575243 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.575179 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.675699 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.675673 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.692164 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.692142 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:07:27.692285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.692270 2568 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:07:27.776403 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.776369 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.776403 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.776389 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:07:27.785505 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.785484 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:07:27.786585 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.786570 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:27.801503 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.801480 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:02:26 +0000 UTC" deadline="2027-12-02 07:41:41.166026252 +0000 UTC" Apr 17 17:07:27.801503 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.801503 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14246h34m13.364526015s" Apr 17 17:07:27.809147 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.809130 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ppcst" Apr 17 17:07:27.816537 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.816516 2568 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-ppcst" Apr 17 17:07:27.877159 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.877136 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:27.885203 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:27.885154 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6e032a9cd3412b502d428b8f5c545c.slice/crio-26dcefde6241fa916c19a219f8ce2a94c514ad1aa5f047d7d610ac5f20276dcb WatchSource:0}: Error finding container 26dcefde6241fa916c19a219f8ce2a94c514ad1aa5f047d7d610ac5f20276dcb: Status 404 returned error can't find the container with id 26dcefde6241fa916c19a219f8ce2a94c514ad1aa5f047d7d610ac5f20276dcb Apr 17 17:07:27.885518 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:27.885497 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod009f099669c612b1a9a7e8809b1d3526.slice/crio-9b9a1f1f86289542ca3bf1281574bd67e694a0001586af334017cb7a3ebe74da WatchSource:0}: Error finding container 9b9a1f1f86289542ca3bf1281574bd67e694a0001586af334017cb7a3ebe74da: Status 404 returned error can't find the container with id 9b9a1f1f86289542ca3bf1281574bd67e694a0001586af334017cb7a3ebe74da Apr 17 17:07:27.889061 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.889043 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:07:27.903916 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.903878 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" event={"ID":"009f099669c612b1a9a7e8809b1d3526","Type":"ContainerStarted","Data":"9b9a1f1f86289542ca3bf1281574bd67e694a0001586af334017cb7a3ebe74da"} Apr 17 17:07:27.904770 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:27.904745 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" event={"ID":"1c6e032a9cd3412b502d428b8f5c545c","Type":"ContainerStarted","Data":"26dcefde6241fa916c19a219f8ce2a94c514ad1aa5f047d7d610ac5f20276dcb"} Apr 17 17:07:27.978008 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:27.977983 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 17 17:07:28.008573 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.008551 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:28.076898 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.076873 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 17 17:07:28.089204 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.089184 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:07:28.090708 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.090696 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 17 17:07:28.099100 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.099086 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:07:28.125405 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.123986 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:28.649628 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.649591 2568 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:28.757225 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.757188 2568 apiserver.go:52] "Watching apiserver" Apr 17 17:07:28.766028 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.766003 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:07:28.766470 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.766438 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4xcb9","openshift-network-diagnostics/network-check-target-z2wnp","openshift-ovn-kubernetes/ovnkube-node-97rxf","kube-system/konnectivity-agent-n49j7","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq","openshift-dns/node-resolver-4vjjz","openshift-image-registry/node-ca-j5pdx","openshift-multus/multus-additional-cni-plugins-b66zf","openshift-network-operator/iptables-alerter-l8qbp","kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal","openshift-cluster-node-tuning-operator/tuned-g6bld","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal","openshift-multus/multus-v68m9"] Apr 17 17:07:28.768086 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.768058 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.770559 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.770525 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.770657 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.770604 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.770990 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.770968 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:07:28.771080 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.771019 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.771140 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.771092 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.771194 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.771148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2thgw\"" Apr 17 17:07:28.771734 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.771715 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:28.771817 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.771790 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:28.772850 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.772831 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.773149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.773100 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.773878 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.773808 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:07:28.773960 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.773899 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:07:28.774236 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.774105 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xdltp\"" Apr 17 17:07:28.774236 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.774165 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2br7m\"" Apr 17 17:07:28.774608 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.774569 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.774716 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.774698 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:07:28.774823 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.774784 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:07:28.775385 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.774990 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 
17:07:28.775385 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.775073 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.775689 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.775670 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.775969 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.775952 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vzwnp\"" Apr 17 17:07:28.776169 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.776153 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.776249 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.776186 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:07:28.777198 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.777146 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.778344 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.778329 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:28.778423 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.778398 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.778490 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.778393 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:28.779128 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.779109 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.779128 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.779117 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.779252 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.779156 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mnfwx\"" Apr 17 17:07:28.779651 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.779635 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.781243 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.781216 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:07:28.781420 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.781228 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kln2x\"" Apr 17 17:07:28.781420 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.781405 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.781540 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.781466 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:07:28.781922 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.781902 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:07:28.782004 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.781960 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:07:28.782149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.782110 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s2gp7\"" Apr 17 17:07:28.782239 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.782166 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.782295 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.782272 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.782446 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.782405 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:07:28.782508 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.782412 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.782851 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.782817 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:07:28.783114 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.783098 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:07:28.783333 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.783295 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kcdpt\"" Apr 17 17:07:28.783816 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.783798 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.786318 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.786290 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9qw6x\"" Apr 17 17:07:28.786406 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.786290 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:07:28.788348 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788324 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cni-binary-copy\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.788441 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-kubelet\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.788441 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788391 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-slash\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.788441 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-ovn\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.788441 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-host\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.788441 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-registration-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788517 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-systemd\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788557 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d848a18d-f010-4ec0-898d-c9d149265ab6-ovn-node-metrics-cert\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788582 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95t9c\" (UniqueName: \"kubernetes.io/projected/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-kube-api-access-95t9c\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788608 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-socket-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-etc-selinux\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.788692 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/31497e40-1e80-4723-9e91-a3e5cfedce92-agent-certs\") pod \"konnectivity-agent-n49j7\" 
(UID: \"31497e40-1e80-4723-9e91-a3e5cfedce92\") " pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788699 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-cni-netd\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa447df9-716a-47c2-9ffb-b819a566f787-tmp-dir\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788773 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/31497e40-1e80-4723-9e91-a3e5cfedce92-konnectivity-ca\") pod \"konnectivity-agent-n49j7\" (UID: \"31497e40-1e80-4723-9e91-a3e5cfedce92\") " pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5csn\" (UniqueName: \"kubernetes.io/projected/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-kube-api-access-l5csn\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788810 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-var-lib-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-node-log\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788858 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-cni-bin\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788881 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-os-release\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:07:28.788927 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5nf\" (UniqueName: \"kubernetes.io/projected/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-kube-api-access-qs5nf\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788951 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa447df9-716a-47c2-9ffb-b819a566f787-hosts-file\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.789015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.788989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-run-netns\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-ovnkube-config\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-device-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: 
\"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmbv\" (UniqueName: \"kubernetes.io/projected/fa447df9-716a-47c2-9ffb-b819a566f787-kube-api-access-wqmbv\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-etc-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-log-socket\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789161 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789179 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zs2b\" (UniqueName: \"kubernetes.io/projected/df7ed8d3-25de-4552-b8a6-d2602eedf81d-kube-api-access-5zs2b\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789231 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-ovnkube-script-lib\") pod \"ovnkube-node-97rxf\" (UID: 
\"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-system-cni-dir\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cnibin\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789288 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-sys-fs\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.789586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789355 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-systemd-units\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.790214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.790214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-run-ovn-kubernetes\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.790214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789448 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-env-overrides\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.790214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789474 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/d848a18d-f010-4ec0-898d-c9d149265ab6-kube-api-access-2ghlf\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.790214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.789497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-serviceca\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.817079 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.817037 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:02:27 +0000 UTC" deadline="2028-01-15 18:48:58.065901984 +0000 UTC" Apr 17 17:07:28.817079 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.817077 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15313h41m29.248829254s" Apr 17 17:07:28.878135 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.878113 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:07:28.890535 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890506 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-system-cni-dir\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " 
pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cnibin\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-sys-fs\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890628 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-systemd\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-system-cni-dir\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-run-ovn-kubernetes\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.890672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cnibin\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-sys-fs\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-env-overrides\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890700 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-run-ovn-kubernetes\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-serviceca\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-multus-certs\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-kubelet\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890839 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-host\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890864 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-kubelet\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890872 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890898 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-registration-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-kubelet\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-host\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890965 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-systemd\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.890984 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95t9c\" (UniqueName: \"kubernetes.io/projected/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-kube-api-access-95t9c\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:28.891051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-sys\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891061 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-registration-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 
17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891085 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-cnibin\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-systemd\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891224 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-serviceca\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891238 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-os-release\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-env-overrides\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:07:28.891280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-cni-netd\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-cni-netd\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/31497e40-1e80-4723-9e91-a3e5cfedce92-konnectivity-ca\") pod \"konnectivity-agent-n49j7\" (UID: \"31497e40-1e80-4723-9e91-a3e5cfedce92\") " pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/313ee430-9d15-42be-9216-97b917fe295e-iptables-alerter-script\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891383 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-host\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 
17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-cni-bin\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891448 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-conf-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3a952143-3965-4339-ba83-4a96e9b34841-multus-daemon-config\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5csn\" (UniqueName: \"kubernetes.io/projected/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-kube-api-access-l5csn\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-cni-bin\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.891811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-os-release\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891652 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-cni-bin\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891731 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5nf\" (UniqueName: \"kubernetes.io/projected/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-kube-api-access-qs5nf\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-os-release\") pod 
\"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmbv\" (UniqueName: \"kubernetes.io/projected/fa447df9-716a-47c2-9ffb-b819a566f787-kube-api-access-wqmbv\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891775 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-device-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891778 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891792 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysctl-d\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-device-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891839 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-cni-multus\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/31497e40-1e80-4723-9e91-a3e5cfedce92-konnectivity-ca\") pod \"konnectivity-agent-n49j7\" (UID: \"31497e40-1e80-4723-9e91-a3e5cfedce92\") " pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891876 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-etc-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891904 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-etc-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891919 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.891975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:28.892582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892001 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-modprobe-d\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892027 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-ovnkube-script-lib\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqq48\" (UniqueName: \"kubernetes.io/projected/313ee430-9d15-42be-9216-97b917fe295e-kube-api-access-qqq48\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-systemd-units\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892196 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ghlf\" (UniqueName: 
\"kubernetes.io/projected/d848a18d-f010-4ec0-898d-c9d149265ab6-kube-api-access-2ghlf\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-systemd-units\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cni-binary-copy\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-var-lib-kubelet\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892272 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a952143-3965-4339-ba83-4a96e9b34841-cni-binary-copy\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-slash\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-ovn\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.892166 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892439 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-slash\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.892472 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:07:29.392431664 +0000 UTC m=+3.008564904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:28.893296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892383 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-lib-modules\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-run-ovn\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-tmp\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892524 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d848a18d-f010-4ec0-898d-c9d149265ab6-ovn-node-metrics-cert\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-socket-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892582 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-etc-selinux\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-ovnkube-script-lib\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/31497e40-1e80-4723-9e91-a3e5cfedce92-agent-certs\") 
pod \"konnectivity-agent-n49j7\" (UID: \"31497e40-1e80-4723-9e91-a3e5cfedce92\") " pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892637 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-socket-dir\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892700 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/313ee430-9d15-42be-9216-97b917fe295e-host-slash\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/df7ed8d3-25de-4552-b8a6-d2602eedf81d-etc-selinux\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-cni-binary-copy\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-run\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892949 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.892776 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-tuned\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa447df9-716a-47c2-9ffb-b819a566f787-tmp-dir\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.893967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysctl-conf\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893274 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-system-cni-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893294 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-cni-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-socket-dir-parent\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893354 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-k8s-cni-cncf-io\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-netns\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893399 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-hostroot\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893428 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-var-lib-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893453 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-node-log\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa447df9-716a-47c2-9ffb-b819a566f787-hosts-file\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893504 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jh5\" (UniqueName: \"kubernetes.io/projected/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-kube-api-access-w5jh5\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-run-netns\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-var-lib-openvswitch\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-ovnkube-config\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893625 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa447df9-716a-47c2-9ffb-b819a566f787-hosts-file\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893653 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-etc-kubernetes\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.894722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6v5v\" (UniqueName: \"kubernetes.io/projected/3a952143-3965-4339-ba83-4a96e9b34841-kube-api-access-p6v5v\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893676 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-host-run-netns\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-log-socket\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893731 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-log-socket\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893741 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zs2b\" (UniqueName: \"kubernetes.io/projected/df7ed8d3-25de-4552-b8a6-d2602eedf81d-kube-api-access-5zs2b\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893783 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysconfig\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893786 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d848a18d-f010-4ec0-898d-c9d149265ab6-node-log\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.893809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-kubernetes\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.895449 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.894150 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa447df9-716a-47c2-9ffb-b819a566f787-tmp-dir\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.895859 ip-10-0-134-244 
kubenswrapper[2568]: I0417 17:07:28.895837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d848a18d-f010-4ec0-898d-c9d149265ab6-ovnkube-config\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.898178 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.896080 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d848a18d-f010-4ec0-898d-c9d149265ab6-ovn-node-metrics-cert\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.898178 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.896262 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/31497e40-1e80-4723-9e91-a3e5cfedce92-agent-certs\") pod \"konnectivity-agent-n49j7\" (UID: \"31497e40-1e80-4723-9e91-a3e5cfedce92\") " pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:28.898909 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.898842 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:28.898909 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.898872 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:28.898909 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.898886 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:28.899237 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:28.898975 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:07:29.398955867 +0000 UTC m=+3.015089125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:28.899795 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.899654 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95t9c\" (UniqueName: \"kubernetes.io/projected/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-kube-api-access-95t9c\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:28.902141 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.902118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmbv\" (UniqueName: \"kubernetes.io/projected/fa447df9-716a-47c2-9ffb-b819a566f787-kube-api-access-wqmbv\") pod \"node-resolver-4vjjz\" (UID: \"fa447df9-716a-47c2-9ffb-b819a566f787\") " pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:28.902606 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.902558 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5nf\" (UniqueName: \"kubernetes.io/projected/8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2-kube-api-access-qs5nf\") pod 
\"multus-additional-cni-plugins-b66zf\" (UID: \"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2\") " pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:28.903518 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.903474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ghlf\" (UniqueName: \"kubernetes.io/projected/d848a18d-f010-4ec0-898d-c9d149265ab6-kube-api-access-2ghlf\") pod \"ovnkube-node-97rxf\" (UID: \"d848a18d-f010-4ec0-898d-c9d149265ab6\") " pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:28.903595 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.903572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5csn\" (UniqueName: \"kubernetes.io/projected/3868d9bd-d06e-4b37-89c0-ab0c05df3fff-kube-api-access-l5csn\") pod \"node-ca-j5pdx\" (UID: \"3868d9bd-d06e-4b37-89c0-ab0c05df3fff\") " pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:28.904820 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.904739 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zs2b\" (UniqueName: \"kubernetes.io/projected/df7ed8d3-25de-4552-b8a6-d2602eedf81d-kube-api-access-5zs2b\") pod \"aws-ebs-csi-driver-node-7sqrq\" (UID: \"df7ed8d3-25de-4552-b8a6-d2602eedf81d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:28.995431 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-modprobe-d\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995431 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqq48\" (UniqueName: 
\"kubernetes.io/projected/313ee430-9d15-42be-9216-97b917fe295e-kube-api-access-qqq48\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-var-lib-kubelet\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a952143-3965-4339-ba83-4a96e9b34841-cni-binary-copy\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-lib-modules\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995536 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-tmp\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-modprobe-d\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/313ee430-9d15-42be-9216-97b917fe295e-host-slash\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-var-lib-kubelet\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995613 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-run\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.995670 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-run\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/313ee430-9d15-42be-9216-97b917fe295e-host-slash\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995738 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-lib-modules\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-tuned\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysctl-conf\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-system-cni-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995929 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-cni-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-socket-dir-parent\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.995981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-k8s-cni-cncf-io\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-netns\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysctl-conf\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-hostroot\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996052 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-netns\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996050 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-system-cni-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-k8s-cni-cncf-io\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-hostroot\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996098 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-socket-dir-parent\") pod \"multus-v68m9\" 
(UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-cni-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jh5\" (UniqueName: \"kubernetes.io/projected/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-kube-api-access-w5jh5\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-etc-kubernetes\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a952143-3965-4339-ba83-4a96e9b34841-cni-binary-copy\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6v5v\" (UniqueName: \"kubernetes.io/projected/3a952143-3965-4339-ba83-4a96e9b34841-kube-api-access-p6v5v\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " 
pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-etc-kubernetes\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysconfig\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-kubernetes\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996319 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-systemd\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996368 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-kubernetes\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " 
pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-systemd\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysconfig\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996392 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-multus-certs\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-kubelet\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996439 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-run-multus-certs\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " 
pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.996696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-sys\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-kubelet\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996485 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-cnibin\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996525 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-sys\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996558 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-cnibin\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996558 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-os-release\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/313ee430-9d15-42be-9216-97b917fe295e-iptables-alerter-script\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-os-release\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996628 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-host\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-cni-bin\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996671 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-host\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-conf-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996697 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-cni-bin\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996709 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3a952143-3965-4339-ba83-4a96e9b34841-multus-daemon-config\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-multus-conf-dir\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysctl-d\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-cni-multus\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996824 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3a952143-3965-4339-ba83-4a96e9b34841-host-var-lib-cni-multus\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.997402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.996869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-sysctl-d\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.998291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.997092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3a952143-3965-4339-ba83-4a96e9b34841-multus-daemon-config\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:28.998291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.997649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/313ee430-9d15-42be-9216-97b917fe295e-iptables-alerter-script\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:28.998291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.998024 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-etc-tuned\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:28.998291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:28.998195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-tmp\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:29.003683 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.003661 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqq48\" (UniqueName: \"kubernetes.io/projected/313ee430-9d15-42be-9216-97b917fe295e-kube-api-access-qqq48\") pod \"iptables-alerter-l8qbp\" (UID: \"313ee430-9d15-42be-9216-97b917fe295e\") " pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:29.004037 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.004017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jh5\" (UniqueName: \"kubernetes.io/projected/19b47c63-cda0-44ea-bcad-5352d6fdeaa4-kube-api-access-w5jh5\") pod \"tuned-g6bld\" (UID: \"19b47c63-cda0-44ea-bcad-5352d6fdeaa4\") " pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:29.004122 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.004017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6v5v\" 
(UniqueName: \"kubernetes.io/projected/3a952143-3965-4339-ba83-4a96e9b34841-kube-api-access-p6v5v\") pod \"multus-v68m9\" (UID: \"3a952143-3965-4339-ba83-4a96e9b34841\") " pod="openshift-multus/multus-v68m9" Apr 17 17:07:29.081012 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.080977 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5pdx" Apr 17 17:07:29.094329 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.094293 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b66zf" Apr 17 17:07:29.103018 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.102996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:29.107584 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.107564 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" Apr 17 17:07:29.114175 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.114143 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4vjjz" Apr 17 17:07:29.120753 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.120733 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:29.129352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.129333 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l8qbp" Apr 17 17:07:29.136800 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.136781 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g6bld" Apr 17 17:07:29.142371 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.142352 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v68m9" Apr 17 17:07:29.400103 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.400069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:29.400278 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.400119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:29.400278 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.400217 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:29.400278 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.400232 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:29.400278 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.400267 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:29.400278 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.400272 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:30.400255962 +0000 UTC m=+4.016389202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:29.400278 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.400278 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:29.400541 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.400344 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:07:30.400326717 +0000 UTC m=+4.016459958 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:29.563357 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.563329 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd848a18d_f010_4ec0_898d_c9d149265ab6.slice/crio-12e6ae4fd6d5c845b45c71cef25dec970c43c420b2beb1ae9bf0cbc2989c1454 WatchSource:0}: Error finding container 12e6ae4fd6d5c845b45c71cef25dec970c43c420b2beb1ae9bf0cbc2989c1454: Status 404 returned error can't find the container with id 12e6ae4fd6d5c845b45c71cef25dec970c43c420b2beb1ae9bf0cbc2989c1454 Apr 17 17:07:29.564448 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.564428 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b47c63_cda0_44ea_bcad_5352d6fdeaa4.slice/crio-ac5bb29a26ce1958c2a371a8dcf4f9a10ee23737156550013df5e5d646fddd43 WatchSource:0}: Error finding container ac5bb29a26ce1958c2a371a8dcf4f9a10ee23737156550013df5e5d646fddd43: Status 404 returned error can't find the container with id ac5bb29a26ce1958c2a371a8dcf4f9a10ee23737156550013df5e5d646fddd43 Apr 17 17:07:29.566007 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.565914 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8052c1d7_4ccd_4aee_a77b_dc2cf3e75eb2.slice/crio-121f574b4ef77fbd22310bd6d6b7dda56b9f7f0a8d6a3b50e7f4a66b83ed3867 WatchSource:0}: Error finding container 121f574b4ef77fbd22310bd6d6b7dda56b9f7f0a8d6a3b50e7f4a66b83ed3867: Status 404 returned error can't find the 
container with id 121f574b4ef77fbd22310bd6d6b7dda56b9f7f0a8d6a3b50e7f4a66b83ed3867 Apr 17 17:07:29.566369 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.566349 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7ed8d3_25de_4552_b8a6_d2602eedf81d.slice/crio-47055828f2a245178f31b8f89462ded5a009ef3fadccd3541dd8bedecfd3fc25 WatchSource:0}: Error finding container 47055828f2a245178f31b8f89462ded5a009ef3fadccd3541dd8bedecfd3fc25: Status 404 returned error can't find the container with id 47055828f2a245178f31b8f89462ded5a009ef3fadccd3541dd8bedecfd3fc25 Apr 17 17:07:29.569498 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.569478 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31497e40_1e80_4723_9e91_a3e5cfedce92.slice/crio-282b9af325e32ac307930696fce41535bef2ee5ef9728ce353eb56ec328fa119 WatchSource:0}: Error finding container 282b9af325e32ac307930696fce41535bef2ee5ef9728ce353eb56ec328fa119: Status 404 returned error can't find the container with id 282b9af325e32ac307930696fce41535bef2ee5ef9728ce353eb56ec328fa119 Apr 17 17:07:29.570507 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.570484 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa447df9_716a_47c2_9ffb_b819a566f787.slice/crio-755dbecec71e286b160fd90692b9c747b95c02519c72e47dc142b466aa1ac343 WatchSource:0}: Error finding container 755dbecec71e286b160fd90692b9c747b95c02519c72e47dc142b466aa1ac343: Status 404 returned error can't find the container with id 755dbecec71e286b160fd90692b9c747b95c02519c72e47dc142b466aa1ac343 Apr 17 17:07:29.571359 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.571295 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313ee430_9d15_42be_9216_97b917fe295e.slice/crio-1e8d40689da8bde35a6f25380cf16242e9548c94a3e79890dcbaf703f9ef2424 WatchSource:0}: Error finding container 1e8d40689da8bde35a6f25380cf16242e9548c94a3e79890dcbaf703f9ef2424: Status 404 returned error can't find the container with id 1e8d40689da8bde35a6f25380cf16242e9548c94a3e79890dcbaf703f9ef2424 Apr 17 17:07:29.572775 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.572392 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a952143_3965_4339_ba83_4a96e9b34841.slice/crio-0069ecf277b8e88223c6b87c8befa59f11022b9804b4025a301bf1ab84bd0d67 WatchSource:0}: Error finding container 0069ecf277b8e88223c6b87c8befa59f11022b9804b4025a301bf1ab84bd0d67: Status 404 returned error can't find the container with id 0069ecf277b8e88223c6b87c8befa59f11022b9804b4025a301bf1ab84bd0d67 Apr 17 17:07:29.574507 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:07:29.573682 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3868d9bd_d06e_4b37_89c0_ab0c05df3fff.slice/crio-0553c96ba370248ce1ca91242d89d3adb52aa2e90f4340f630f8b102688dd9c4 WatchSource:0}: Error finding container 0553c96ba370248ce1ca91242d89d3adb52aa2e90f4340f630f8b102688dd9c4: Status 404 returned error can't find the container with id 0553c96ba370248ce1ca91242d89d3adb52aa2e90f4340f630f8b102688dd9c4 Apr 17 17:07:29.818046 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.817835 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:02:27 +0000 UTC" deadline="2027-10-26 19:16:12.89349065 +0000 UTC" Apr 17 17:07:29.818046 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.818037 2568 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13370h8m43.075458003s" Apr 17 17:07:29.901438 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.901411 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:29.901574 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:29.901527 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:29.907463 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.907428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v68m9" event={"ID":"3a952143-3965-4339-ba83-4a96e9b34841","Type":"ContainerStarted","Data":"0069ecf277b8e88223c6b87c8befa59f11022b9804b4025a301bf1ab84bd0d67"} Apr 17 17:07:29.908334 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.908314 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l8qbp" event={"ID":"313ee430-9d15-42be-9216-97b917fe295e","Type":"ContainerStarted","Data":"1e8d40689da8bde35a6f25380cf16242e9548c94a3e79890dcbaf703f9ef2424"} Apr 17 17:07:29.909209 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.909185 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4vjjz" event={"ID":"fa447df9-716a-47c2-9ffb-b819a566f787","Type":"ContainerStarted","Data":"755dbecec71e286b160fd90692b9c747b95c02519c72e47dc142b466aa1ac343"} Apr 17 17:07:29.910046 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.910029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g6bld" 
event={"ID":"19b47c63-cda0-44ea-bcad-5352d6fdeaa4","Type":"ContainerStarted","Data":"ac5bb29a26ce1958c2a371a8dcf4f9a10ee23737156550013df5e5d646fddd43"} Apr 17 17:07:29.910971 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.910942 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"12e6ae4fd6d5c845b45c71cef25dec970c43c420b2beb1ae9bf0cbc2989c1454"} Apr 17 17:07:29.912314 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.912277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" event={"ID":"009f099669c612b1a9a7e8809b1d3526","Type":"ContainerStarted","Data":"499e4c94dbc88543fc16967f7c36bad20cf515698c21528de1b7cd03d493c07f"} Apr 17 17:07:29.915615 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.915594 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5pdx" event={"ID":"3868d9bd-d06e-4b37-89c0-ab0c05df3fff","Type":"ContainerStarted","Data":"0553c96ba370248ce1ca91242d89d3adb52aa2e90f4340f630f8b102688dd9c4"} Apr 17 17:07:29.916530 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.916506 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n49j7" event={"ID":"31497e40-1e80-4723-9e91-a3e5cfedce92","Type":"ContainerStarted","Data":"282b9af325e32ac307930696fce41535bef2ee5ef9728ce353eb56ec328fa119"} Apr 17 17:07:29.917391 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.917365 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" event={"ID":"df7ed8d3-25de-4552-b8a6-d2602eedf81d","Type":"ContainerStarted","Data":"47055828f2a245178f31b8f89462ded5a009ef3fadccd3541dd8bedecfd3fc25"} Apr 17 17:07:29.918288 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.918266 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerStarted","Data":"121f574b4ef77fbd22310bd6d6b7dda56b9f7f0a8d6a3b50e7f4a66b83ed3867"} Apr 17 17:07:29.925079 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:29.925043 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" podStartSLOduration=1.92503238 podStartE2EDuration="1.92503238s" podCreationTimestamp="2026-04-17 17:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:07:29.924736498 +0000 UTC m=+3.540869760" watchObservedRunningTime="2026-04-17 17:07:29.92503238 +0000 UTC m=+3.541165643" Apr 17 17:07:30.408252 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:30.408158 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:30.408252 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:30.408218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:30.408480 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.408380 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:30.408480 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.408440 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:32.408422172 +0000 UTC m=+6.024555422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:30.409330 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.408842 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:30.409330 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.408861 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:30.409330 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.408873 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:30.409330 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.408918 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:07:32.408903794 +0000 UTC m=+6.025037040 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:30.903662 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:30.903632 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:30.904159 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:30.903760 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:30.939450 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:30.938737 2568 generic.go:358] "Generic (PLEG): container finished" podID="1c6e032a9cd3412b502d428b8f5c545c" containerID="a393eb989171f785b442d2be250b6830990f93746be6ebf3656c6487eccd22ba" exitCode=0 Apr 17 17:07:30.939450 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:30.939362 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" event={"ID":"1c6e032a9cd3412b502d428b8f5c545c","Type":"ContainerDied","Data":"a393eb989171f785b442d2be250b6830990f93746be6ebf3656c6487eccd22ba"} Apr 17 17:07:31.901248 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:31.901220 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:31.901417 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:31.901361 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:31.950319 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:31.950166 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" event={"ID":"1c6e032a9cd3412b502d428b8f5c545c","Type":"ContainerStarted","Data":"5c5a3a3bbf1907c005a476549d783402f586e2493c231ff35ba43a85ce04c030"} Apr 17 17:07:31.965141 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:31.964425 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" podStartSLOduration=3.964405891 podStartE2EDuration="3.964405891s" podCreationTimestamp="2026-04-17 17:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:07:31.963791008 +0000 UTC m=+5.579924272" watchObservedRunningTime="2026-04-17 17:07:31.964405891 +0000 UTC m=+5.580539154" Apr 17 17:07:32.424225 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:32.424189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " 
pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:32.424416 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:32.424240 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:32.424416 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.424373 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:32.424416 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.424387 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:32.424416 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.424399 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:32.424416 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.424414 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:32.424699 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.424455 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:36.424436386 +0000 UTC m=+10.040569631 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:32.424699 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.424473 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:07:36.424463997 +0000 UTC m=+10.040597239 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:32.901207 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:32.901177 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:32.901410 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:32.901369 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:33.901123 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:33.901089 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:33.901595 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:33.901216 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:34.901134 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:34.901101 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:34.901601 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:34.901250 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:35.900769 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:35.900722 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:35.900928 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:35.900859 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:36.456655 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:36.456611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:36.456664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.456784 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.456847 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:44.456826369 +0000 UTC m=+18.072959613 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.456943 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.456960 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.456973 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:36.457116 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.457011 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:07:44.456999597 +0000 UTC m=+18.073132845 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:36.902904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:36.902439 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:36.902904 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:36.902562 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:37.900829 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:37.900786 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:37.901265 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:37.900912 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:38.901683 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:38.901650 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:38.902153 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:38.901806 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:39.901023 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:39.900993 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:39.901300 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:39.901093 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:40.901649 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:40.901618 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:40.902098 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:40.901742 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:41.901592 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:41.901560 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:41.901743 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:41.901654 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:42.663187 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.663160 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gfbn9"] Apr 17 17:07:42.667464 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.667439 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.667588 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:42.667508 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:42.701283 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.701251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.701439 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.701317 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-dbus\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.701439 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.701374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-kubelet-config\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.802211 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.802171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.802412 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.802232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-dbus\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.802412 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.802278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-kubelet-config\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.802412 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:42.802356 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:42.802412 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.802373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-kubelet-config\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.802614 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:42.802435 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret podName:fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:43.302413807 +0000 UTC m=+16.918547048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret") pod "global-pull-secret-syncer-gfbn9" (UID: "fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:42.802614 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.802449 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-dbus\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:42.901181 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:42.901140 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:42.901363 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:42.901284 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:43.307798 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:43.307759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:43.308214 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:43.307934 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:43.308214 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:43.308006 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret podName:fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:44.307987265 +0000 UTC m=+17.924120513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret") pod "global-pull-secret-syncer-gfbn9" (UID: "fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:43.901099 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:43.901020 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:43.901266 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:43.901020 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:43.901266 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:43.901145 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:43.901266 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:43.901250 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:44.317818 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:44.317773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:44.318263 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.317905 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:44.318263 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.317984 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret podName:fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:07:46.317963041 +0000 UTC m=+19.934096281 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret") pod "global-pull-secret-syncer-gfbn9" (UID: "fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:44.519768 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:44.519727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:44.519768 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:44.519769 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:44.520003 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.519886 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:44.520003 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.519904 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:44.520003 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.519929 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 17 17:07:44.520003 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.519940 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:44.520003 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.519946 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.519929774 +0000 UTC m=+34.136063014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:44.520003 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.519997 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.519980787 +0000 UTC m=+34.136114034 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:44.901100 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:44.901067 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:44.901265 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:44.901189 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:45.901364 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:45.901326 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:45.901793 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:45.901326 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:45.901793 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:45.901457 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:45.901793 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:45.901519 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:46.333457 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.333430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:46.333584 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:46.333564 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:46.333632 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:46.333624 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret podName:fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:50.333609825 +0000 UTC m=+23.949743065 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret") pod "global-pull-secret-syncer-gfbn9" (UID: "fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:46.901613 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.901474 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:46.902126 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:46.901692 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:46.978502 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.978465 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n49j7" event={"ID":"31497e40-1e80-4723-9e91-a3e5cfedce92","Type":"ContainerStarted","Data":"4267703f60377382b63a11846c9ca7482232b5f433992aad66a69f987b7af2b8"} Apr 17 17:07:46.979848 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.979825 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" event={"ID":"df7ed8d3-25de-4552-b8a6-d2602eedf81d","Type":"ContainerStarted","Data":"08f91e4589962c3eae3e8ee51a32547d7c3ff7c85d7b230e77007c549ad52f68"} Apr 17 17:07:46.981277 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.981252 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" 
event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerStarted","Data":"c77cd97dfc3fe750c8fcf2483f65deabfcf68d3fa391822e9cbcbaa39c598865"} Apr 17 17:07:46.982859 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.982835 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v68m9" event={"ID":"3a952143-3965-4339-ba83-4a96e9b34841","Type":"ContainerStarted","Data":"eb9539a9ae6f870baa1b4a0ed5596508f72e0d03d2370d2945d3125e3a1bbd83"} Apr 17 17:07:46.983976 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.983958 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4vjjz" event={"ID":"fa447df9-716a-47c2-9ffb-b819a566f787","Type":"ContainerStarted","Data":"0c935f02ebf6c4ce44e9bb682cb1fb69841c4c0505cc2199c8c26096318cbd77"} Apr 17 17:07:46.985249 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.985228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g6bld" event={"ID":"19b47c63-cda0-44ea-bcad-5352d6fdeaa4","Type":"ContainerStarted","Data":"2d5fa48ab7ec6e3bd3e7baf66e401563263ea467417f47f6c622c01fd2f56547"} Apr 17 17:07:46.987556 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.987527 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"a03f71c27d73fb1f54b3a7f7aeac91be6898213b7a0fb2ea79a758cf716ec9be"} Apr 17 17:07:46.987632 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.987564 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"75e5572485ea960d8064e69c1157c6a9f265f060e2749e70c70b1eba30437f60"} Apr 17 17:07:46.989215 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:46.989176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-j5pdx" event={"ID":"3868d9bd-d06e-4b37-89c0-ab0c05df3fff","Type":"ContainerStarted","Data":"303092446fdc57ae48a78739e25d8183dd5d376678320c1a7c24843a793374a6"} Apr 17 17:07:47.016709 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.016498 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-n49j7" podStartSLOduration=11.10022129 podStartE2EDuration="20.01647777s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.571439842 +0000 UTC m=+3.187573085" lastFinishedPulling="2026-04-17 17:07:38.48769632 +0000 UTC m=+12.103829565" observedRunningTime="2026-04-17 17:07:46.996492927 +0000 UTC m=+20.612626191" watchObservedRunningTime="2026-04-17 17:07:47.01647777 +0000 UTC m=+20.632611034" Apr 17 17:07:47.047363 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.047280 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4vjjz" podStartSLOduration=3.177904723 podStartE2EDuration="20.047265572s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.572742778 +0000 UTC m=+3.188876021" lastFinishedPulling="2026-04-17 17:07:46.442103619 +0000 UTC m=+20.058236870" observedRunningTime="2026-04-17 17:07:47.019637719 +0000 UTC m=+20.635770982" watchObservedRunningTime="2026-04-17 17:07:47.047265572 +0000 UTC m=+20.663398845" Apr 17 17:07:47.077718 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.077677 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j5pdx" podStartSLOduration=4.18732917 podStartE2EDuration="21.077662654s" podCreationTimestamp="2026-04-17 17:07:26 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.5760963 +0000 UTC m=+3.192229541" lastFinishedPulling="2026-04-17 17:07:46.46642977 +0000 UTC m=+20.082563025" observedRunningTime="2026-04-17 17:07:47.050695945 +0000 UTC 
m=+20.666829208" watchObservedRunningTime="2026-04-17 17:07:47.077662654 +0000 UTC m=+20.693795916" Apr 17 17:07:47.077901 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.077880 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-g6bld" podStartSLOduration=3.154845039 podStartE2EDuration="20.077875952s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.56692379 +0000 UTC m=+3.183057032" lastFinishedPulling="2026-04-17 17:07:46.489954704 +0000 UTC m=+20.106087945" observedRunningTime="2026-04-17 17:07:47.077493261 +0000 UTC m=+20.693626527" watchObservedRunningTime="2026-04-17 17:07:47.077875952 +0000 UTC m=+20.694009214" Apr 17 17:07:47.108453 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.108415 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v68m9" podStartSLOduration=3.179656673 podStartE2EDuration="20.108401459s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.574765948 +0000 UTC m=+3.190899189" lastFinishedPulling="2026-04-17 17:07:46.503510735 +0000 UTC m=+20.119643975" observedRunningTime="2026-04-17 17:07:47.107944945 +0000 UTC m=+20.724078207" watchObservedRunningTime="2026-04-17 17:07:47.108401459 +0000 UTC m=+20.724534722" Apr 17 17:07:47.849049 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.849033 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:07:47.900739 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.900719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:47.900822 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.900719 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:47.900822 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:47.900809 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:47.900901 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:47.900884 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:47.992343 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.992274 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" event={"ID":"df7ed8d3-25de-4552-b8a6-d2602eedf81d","Type":"ContainerStarted","Data":"68685fed894a94909a496630757a7ef7ef01141118e1078cea6ace556c32faba"} Apr 17 17:07:47.993693 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.993667 2568 generic.go:358] "Generic (PLEG): container finished" podID="8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2" containerID="c77cd97dfc3fe750c8fcf2483f65deabfcf68d3fa391822e9cbcbaa39c598865" exitCode=0 Apr 17 17:07:47.994320 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.993698 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" 
event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerDied","Data":"c77cd97dfc3fe750c8fcf2483f65deabfcf68d3fa391822e9cbcbaa39c598865"} Apr 17 17:07:47.997897 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:47.997875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l8qbp" event={"ID":"313ee430-9d15-42be-9216-97b917fe295e","Type":"ContainerStarted","Data":"1bce14ceaf1451d3588d7afe39c04564865bb1343b2a0edfc2687ab151a5c939"} Apr 17 17:07:48.000356 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.000339 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:07:48.002010 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.001968 2568 generic.go:358] "Generic (PLEG): container finished" podID="d848a18d-f010-4ec0-898d-c9d149265ab6" containerID="a03f71c27d73fb1f54b3a7f7aeac91be6898213b7a0fb2ea79a758cf716ec9be" exitCode=1 Apr 17 17:07:48.002010 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.001998 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerDied","Data":"a03f71c27d73fb1f54b3a7f7aeac91be6898213b7a0fb2ea79a758cf716ec9be"} Apr 17 17:07:48.002157 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.002024 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"d86fd8174aace6c5dbb04377e01772deb11541142c3282f1a434a362b094730b"} Apr 17 17:07:48.002157 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.002038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" 
event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"bbd89af994e13d2a06d14df53d9219ee2c988fca2f9733653bdb273505f92dd1"} Apr 17 17:07:48.002157 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.002047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"e07513b34f9a891b5a9f62c771db3d7c989324d474a7cbcbf94b36f7d0253314"} Apr 17 17:07:48.002157 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.002056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"22e293a2898194ee5b66711c750b5f2ff61dd06e1dc34b292d745d8a7a3928b9"} Apr 17 17:07:48.408535 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.408504 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:48.408966 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.408954 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:48.422359 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.422285 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l8qbp" podStartSLOduration=4.508589889 podStartE2EDuration="21.422272694s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.573956885 +0000 UTC m=+3.190090131" lastFinishedPulling="2026-04-17 17:07:46.487639682 +0000 UTC m=+20.103772936" observedRunningTime="2026-04-17 17:07:48.024104952 +0000 UTC m=+21.640238214" watchObservedRunningTime="2026-04-17 17:07:48.422272694 +0000 UTC m=+22.038405957" Apr 17 17:07:48.849771 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.849656 2568 reconciler.go:161] 
"OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:07:47.849046372Z","UUID":"7c72e661-1c3a-49e8-baae-21c50976387e","Handler":null,"Name":"","Endpoint":""} Apr 17 17:07:48.852484 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.852123 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:07:48.852484 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.852154 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:07:48.901362 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:48.901335 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:48.901505 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:48.901429 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:49.003881 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:49.003852 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:49.004296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:49.004248 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-n49j7" Apr 17 17:07:49.901571 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:49.901404 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:49.901718 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:49.901404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:49.901781 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:49.901746 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:49.901823 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:49.901639 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:50.007582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:50.007544 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" event={"ID":"df7ed8d3-25de-4552-b8a6-d2602eedf81d","Type":"ContainerStarted","Data":"698510899d1a7ec4bfc0b66c6801d467b093aba16586776c056497facfaac3d7"} Apr 17 17:07:50.010663 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:50.010635 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:07:50.012413 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:50.012379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"841869af4e63b7fd46e759cb9a69ca87c1a8253e53d62fd56160ca957b75cb9c"} Apr 17 17:07:50.027623 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:50.027582 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7sqrq" podStartSLOduration=3.313987345 podStartE2EDuration="23.027564213s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.568895481 +0000 UTC m=+3.185028724" lastFinishedPulling="2026-04-17 17:07:49.282472349 +0000 UTC m=+22.898605592" observedRunningTime="2026-04-17 17:07:50.026987586 +0000 UTC m=+23.643120891" watchObservedRunningTime="2026-04-17 17:07:50.027564213 +0000 UTC m=+23.643697478" Apr 17 17:07:50.366058 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:50.366023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod 
\"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:50.366200 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:50.366134 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:50.366200 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:50.366188 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret podName:fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:58.366174383 +0000 UTC m=+31.982307624 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret") pod "global-pull-secret-syncer-gfbn9" (UID: "fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:50.900850 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:50.900817 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:50.901044 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:50.900959 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:51.901105 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:51.901025 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:51.901773 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:51.901142 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:51.901773 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:51.901173 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:51.901773 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:51.901225 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:52.900678 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:52.900519 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:52.900777 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:52.900757 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:53.019451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.019428 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:07:53.019897 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.019756 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"986bc2da9a8e808f6e36bad9b5b2371475c092f28e4f1c2559e10c33d96b6724"} Apr 17 17:07:53.020075 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.020058 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:53.020164 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.020083 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:53.020289 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.020276 2568 scope.go:117] "RemoveContainer" containerID="a03f71c27d73fb1f54b3a7f7aeac91be6898213b7a0fb2ea79a758cf716ec9be" Apr 17 17:07:53.042887 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.042867 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:53.901382 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.901342 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:53.901596 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:53.901342 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:53.901596 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:53.901436 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:53.901596 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:53.901521 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:54.025953 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.025927 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:07:54.026420 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.026397 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" event={"ID":"d848a18d-f010-4ec0-898d-c9d149265ab6","Type":"ContainerStarted","Data":"2f72eedbf21efdb0666bf6dbfd7d441087b1243f19015d5958c0b0c7218c778e"} Apr 17 17:07:54.026783 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.026756 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:54.028247 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.028220 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2" containerID="dc3bc362ad59df307b21fec084d674cfae4c9ca8f80e2804961c5dd8a68cdebc" exitCode=0 Apr 17 17:07:54.028380 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.028255 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerDied","Data":"dc3bc362ad59df307b21fec084d674cfae4c9ca8f80e2804961c5dd8a68cdebc"} Apr 17 17:07:54.042242 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.042224 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:07:54.058877 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.058835 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" podStartSLOduration=9.866925261 podStartE2EDuration="27.058822485s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.565485849 +0000 UTC m=+3.181619095" lastFinishedPulling="2026-04-17 17:07:46.75738306 +0000 UTC m=+20.373516319" observedRunningTime="2026-04-17 17:07:54.058323866 +0000 UTC m=+27.674457129" watchObservedRunningTime="2026-04-17 17:07:54.058822485 +0000 UTC m=+27.674955748" Apr 17 17:07:54.743646 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.743614 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gfbn9"] Apr 17 17:07:54.743859 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.743727 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:54.743859 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:54.743834 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:54.747602 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.747573 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4xcb9"] Apr 17 17:07:54.747722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.747699 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:54.747833 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:54.747811 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:54.748202 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.748185 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z2wnp"] Apr 17 17:07:54.748285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:54.748259 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:54.748533 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:54.748355 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:55.031457 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:55.031428 2568 generic.go:358] "Generic (PLEG): container finished" podID="8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2" containerID="4b55e7c72c6b7d6f6701c9a2e621432664fd44b1bff92e325990bc812fc4aaec" exitCode=0 Apr 17 17:07:55.031817 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:55.031522 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerDied","Data":"4b55e7c72c6b7d6f6701c9a2e621432664fd44b1bff92e325990bc812fc4aaec"} Apr 17 17:07:55.901367 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:55.901299 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:55.901367 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:55.901299 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:55.901584 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:55.901408 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:55.901584 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:55.901458 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:56.037576 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:56.037541 2568 generic.go:358] "Generic (PLEG): container finished" podID="8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2" containerID="7761a3add55a9342664544fbc844e42a8c95b693233bb4eebaf9ef05e8c8747e" exitCode=0 Apr 17 17:07:56.037996 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:56.037624 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerDied","Data":"7761a3add55a9342664544fbc844e42a8c95b693233bb4eebaf9ef05e8c8747e"} Apr 17 17:07:56.901865 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:56.901670 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:56.902032 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:56.901931 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:57.901535 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:57.901505 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:57.902106 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:57.901506 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:57.902106 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:57.901622 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gfbn9" podUID="fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0" Apr 17 17:07:57.902106 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:57.901710 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89" Apr 17 17:07:58.426570 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:58.426534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:58.426744 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:58.426694 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:58.426810 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:58.426763 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret podName:fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:14.426747014 +0000 UTC m=+48.042880260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret") pod "global-pull-secret-syncer-gfbn9" (UID: "fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:58.900657 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:58.900624 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp" Apr 17 17:07:58.900822 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:07:58.900727 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z2wnp" podUID="8e113e09-f336-4b95-a2e1-db3f043106af" Apr 17 17:07:59.768011 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.767982 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeReady" Apr 17 17:07:59.768586 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.768140 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:07:59.802543 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.802508 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"] Apr 17 17:07:59.806592 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.806568 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f797f5f96-zsrx2"] Apr 17 17:07:59.806766 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.806735 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" Apr 17 17:07:59.809403 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.809386 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"] Apr 17 17:07:59.809561 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.809537 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.809668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.809594 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 17:07:59.809842 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.809827 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:07:59.810083 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.810066 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tgrb7\"" Apr 17 17:07:59.810158 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.810106 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:07:59.810158 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.810136 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:07:59.811914 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.811898 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"] Apr 17 17:07:59.812059 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.812041 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" Apr 17 17:07:59.812569 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.812399 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:07:59.812569 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.812412 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:07:59.812569 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.812425 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b4xpj\"" Apr 17 17:07:59.812569 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.812400 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:07:59.818094 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.818059 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" Apr 17 17:07:59.818197 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.818163 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 17:07:59.820010 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.819989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:07:59.820252 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.820203 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"] Apr 17 17:07:59.821070 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.821047 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f797f5f96-zsrx2"] Apr 17 17:07:59.821182 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.821166 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 17:07:59.821260 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.821244 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 17:07:59.821434 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.821408 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 17:07:59.821989 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.821969 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 17:07:59.831081 
ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.831044 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-whlxq"] Apr 17 17:07:59.833973 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.833954 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6b8st"] Apr 17 17:07:59.834080 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.834061 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-whlxq" Apr 17 17:07:59.837051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.836830 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:07:59.837051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.836845 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6b8st" Apr 17 17:07:59.837051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.836833 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:07:59.837256 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.837093 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j45n5\"" Apr 17 17:07:59.842042 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.842024 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:07:59.842139 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.842059 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:07:59.842139 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.842066 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:07:59.842379 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.842363 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qdqk7\"" Apr 17 17:07:59.847421 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.847400 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"] Apr 17 17:07:59.848264 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.848246 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"] Apr 17 17:07:59.866004 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.865894 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-whlxq"] Apr 17 17:07:59.901450 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.901422 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:07:59.901450 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.901444 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:07:59.904239 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.904219 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:07:59.904370 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.904249 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mrqrv\"" Apr 17 17:07:59.904370 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.904284 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:07:59.911819 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.911776 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6b8st"] Apr 17 17:07:59.940214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dk4f\" (UniqueName: \"kubernetes.io/projected/76642cd9-fee5-4031-a695-f531ec9d1dcc-kube-api-access-9dk4f\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" Apr 17 17:07:59.940352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940232 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-tmp-dir\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 17:07:59.940352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69xkc\" (UniqueName: 
\"kubernetes.io/projected/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-kube-api-access-69xkc\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 17:07:59.940352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940327 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7dk\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-kube-api-access-hk7dk\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.940528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" Apr 17 17:07:59.940528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e5ec3f84-dd05-4c76-b51e-66780aa5f059-klusterlet-config\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" Apr 17 17:07:59.940528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940469 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: 
\"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.940528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-certificates\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.940528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-bound-sa-token\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.940730 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940544 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psl2q\" (UniqueName: \"kubernetes.io/projected/e5ec3f84-dd05-4c76-b51e-66780aa5f059-kube-api-access-psl2q\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" Apr 17 17:07:59.940730 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940601 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5ec3f84-dd05-4c76-b51e-66780aa5f059-tmp\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" Apr 17 17:07:59.940730 
ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940646 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-ca\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" Apr 17 17:07:59.940730 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940687 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-trusted-ca\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.940730 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940716 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-config-volume\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 17:07:59.940957 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwshl\" (UniqueName: \"kubernetes.io/projected/b183f9d4-46f5-4c01-84a9-342f742b9289-kube-api-access-xwshl\") pod \"managed-serviceaccount-addon-agent-595ccf57f7-xzv8n\" (UID: \"b183f9d4-46f5-4c01-84a9-342f742b9289\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" Apr 17 17:07:59.940957 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-image-registry-private-configuration\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:07:59.940957 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940793 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/76642cd9-fee5-4031-a695-f531ec9d1dcc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" Apr 17 17:07:59.940957 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b183f9d4-46f5-4c01-84a9-342f742b9289-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-595ccf57f7-xzv8n\" (UID: \"b183f9d4-46f5-4c01-84a9-342f742b9289\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" Apr 17 17:07:59.940957 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940922 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" Apr 17 17:07:59.941149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.940975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/138cec68-511a-467f-b5de-b2b7fc19fbb5-ca-trust-extracted\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:07:59.941149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.941005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-installation-pull-secrets\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:07:59.941149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.941025 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:07:59.941149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.941046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nzcx\" (UniqueName: \"kubernetes.io/projected/da1e7104-4977-42cd-80d8-b8775e3a717b-kube-api-access-4nzcx\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:07:59.941149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.941069 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:07:59.941149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:07:59.941089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-hub\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.041716 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041632 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e5ec3f84-dd05-4c76-b51e-66780aa5f059-klusterlet-config\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.041716 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.041716 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-certificates\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-bound-sa-token\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psl2q\" (UniqueName: \"kubernetes.io/projected/e5ec3f84-dd05-4c76-b51e-66780aa5f059-kube-api-access-psl2q\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5ec3f84-dd05-4c76-b51e-66780aa5f059-tmp\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.041785 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.041807 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.041871 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.541849743 +0000 UTC m=+34.157983005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-ca\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.041989 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.041980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-trusted-ca\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.042351 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.042010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-config-volume\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.042084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwshl\" (UniqueName: \"kubernetes.io/projected/b183f9d4-46f5-4c01-84a9-342f742b9289-kube-api-access-xwshl\") pod \"managed-serviceaccount-addon-agent-595ccf57f7-xzv8n\" (UID: \"b183f9d4-46f5-4c01-84a9-342f742b9289\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.042815 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-image-registry-private-configuration\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.042870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/76642cd9-fee5-4031-a695-f531ec9d1dcc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.042912 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5ec3f84-dd05-4c76-b51e-66780aa5f059-tmp\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.042951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b183f9d4-46f5-4c01-84a9-342f742b9289-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-595ccf57f7-xzv8n\" (UID: \"b183f9d4-46f5-4c01-84a9-342f742b9289\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.043510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-certificates\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.043532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-config-volume\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.043777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-trusted-ca\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044103 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/76642cd9-fee5-4031-a695-f531ec9d1dcc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/138cec68-511a-467f-b5de-b2b7fc19fbb5-ca-trust-extracted\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-installation-pull-secrets\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nzcx\" (UniqueName: \"kubernetes.io/projected/da1e7104-4977-42cd-80d8-b8775e3a717b-kube-api-access-4nzcx\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.046330 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-hub\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dk4f\" (UniqueName: \"kubernetes.io/projected/76642cd9-fee5-4031-a695-f531ec9d1dcc-kube-api-access-9dk4f\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044589 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-tmp-dir\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044620 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69xkc\" (UniqueName: \"kubernetes.io/projected/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-kube-api-access-69xkc\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7dk\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-kube-api-access-hk7dk\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.044689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.046955 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:00.047206 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.047020 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.547002737 +0000 UTC m=+34.163135998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:00.047639 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.047438 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-image-registry-private-configuration\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.047710 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.047674 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-ca\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.049049 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.047798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/138cec68-511a-467f-b5de-b2b7fc19fbb5-ca-trust-extracted\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.049049 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.047912 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:00.049049 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.047963 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.547947852 +0000 UTC m=+34.164081096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found
Apr 17 17:08:00.049049 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.048548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.049049 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.048918 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-tmp-dir\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.049781 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.049408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e5ec3f84-dd05-4c76-b51e-66780aa5f059-klusterlet-config\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.050531 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.050128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.051564 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.051535 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b183f9d4-46f5-4c01-84a9-342f742b9289-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-595ccf57f7-xzv8n\" (UID: \"b183f9d4-46f5-4c01-84a9-342f742b9289\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"
Apr 17 17:08:00.051891 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.051866 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/76642cd9-fee5-4031-a695-f531ec9d1dcc-hub\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.052430 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.052407 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-installation-pull-secrets\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.052829 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.052807 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-bound-sa-token\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.053947 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.053916 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psl2q\" (UniqueName: \"kubernetes.io/projected/e5ec3f84-dd05-4c76-b51e-66780aa5f059-kube-api-access-psl2q\") pod \"klusterlet-addon-workmgr-cfddc449d-9htbx\" (UID: \"e5ec3f84-dd05-4c76-b51e-66780aa5f059\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.054074 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.053971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nzcx\" (UniqueName: \"kubernetes.io/projected/da1e7104-4977-42cd-80d8-b8775e3a717b-kube-api-access-4nzcx\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:08:00.054363 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.054342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwshl\" (UniqueName: \"kubernetes.io/projected/b183f9d4-46f5-4c01-84a9-342f742b9289-kube-api-access-xwshl\") pod \"managed-serviceaccount-addon-agent-595ccf57f7-xzv8n\" (UID: \"b183f9d4-46f5-4c01-84a9-342f742b9289\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"
Apr 17 17:08:00.057012 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.056986 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dk4f\" (UniqueName: \"kubernetes.io/projected/76642cd9-fee5-4031-a695-f531ec9d1dcc-kube-api-access-9dk4f\") pod \"cluster-proxy-proxy-agent-7bdd67b75d-q49dd\" (UID: \"76642cd9-fee5-4031-a695-f531ec9d1dcc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.057379 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.057359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69xkc\" (UniqueName: \"kubernetes.io/projected/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-kube-api-access-69xkc\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.057448 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.057413 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7dk\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-kube-api-access-hk7dk\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.130195 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.130154 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"
Apr 17 17:08:00.147219 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.147190 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:08:00.154868 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.154846 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:08:00.549148 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.549115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:08:00.549148 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.549155 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.549185 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.549270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9"
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549291 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549324 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549341 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549352 2568 projected.go:194] Error preparing data for projected volume kube-api-access-ntjc2 for pod openshift-network-diagnostics/network-check-target-z2wnp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.549358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549378 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:00.549419 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549422 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2 podName:8e113e09-f336-4b95-a2e1-db3f043106af nodeName:}" failed. No retries permitted until 2026-04-17 17:08:32.549400088 +0000 UTC m=+66.165533345 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ntjc2" (UniqueName: "kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2") pod "network-check-target-z2wnp" (UID: "8e113e09-f336-4b95-a2e1-db3f043106af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549444 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:01.549434094 +0000 UTC m=+35.165567338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549461 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:01.54945221 +0000 UTC m=+35.165585691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549462 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549474 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549483 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549521 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:01.549506129 +0000 UTC m=+35.165639391 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found
Apr 17 17:08:00.549792 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:00.549534 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:32.549527742 +0000 UTC m=+66.165660983 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : secret "metrics-daemon-secret" not found
Apr 17 17:08:00.901458 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.901371 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:08:00.904336 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.904296 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:08:00.905420 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.905400 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s65fp\""
Apr 17 17:08:00.905545 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:00.905424 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:08:01.556964 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:01.556921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:08:01.556964 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:01.556969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:08:01.557179 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557060 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:01.557179 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557069 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:01.557179 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557118 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:03.557103324 +0000 UTC m=+37.173236588 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:01.557179 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557138 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:03.557130043 +0000 UTC m=+37.173263286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found
Apr 17 17:08:01.557396 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:01.557202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:08:01.557396 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557359 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:08:01.557396 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557376 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found
Apr 17 17:08:01.557515 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:01.557432 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:03.557414298 +0000 UTC m=+37.173547540 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found
Apr 17 17:08:02.284980 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:02.284809 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"]
Apr 17 17:08:02.291962 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:02.291941 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"]
Apr 17 17:08:02.299121 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:02.299098 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"]
Apr 17 17:08:02.316198 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:08:02.316174 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb183f9d4_46f5_4c01_84a9_342f742b9289.slice/crio-15af858e50870e9321b49d6d3783f12b5ef0cf776d3463b3d0456b1db2a8873d WatchSource:0}: Error finding container 15af858e50870e9321b49d6d3783f12b5ef0cf776d3463b3d0456b1db2a8873d: Status 404 returned error can't find the container with id 15af858e50870e9321b49d6d3783f12b5ef0cf776d3463b3d0456b1db2a8873d
Apr 17 17:08:02.317239 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:08:02.317213 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ec3f84_dd05_4c76_b51e_66780aa5f059.slice/crio-788ac28bd49cf8abc48627d89e463c26e5b91e40b60049f3818be341bd972478 WatchSource:0}: Error finding container 788ac28bd49cf8abc48627d89e463c26e5b91e40b60049f3818be341bd972478: Status 404 returned error can't find the container with
id 788ac28bd49cf8abc48627d89e463c26e5b91e40b60049f3818be341bd972478 Apr 17 17:08:02.318029 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:08:02.318007 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76642cd9_fee5_4031_a695_f531ec9d1dcc.slice/crio-239fabd89c2c4272f425e7d714d1ad078e1a4f8e52e6ae6e7156d3fab1032380 WatchSource:0}: Error finding container 239fabd89c2c4272f425e7d714d1ad078e1a4f8e52e6ae6e7156d3fab1032380: Status 404 returned error can't find the container with id 239fabd89c2c4272f425e7d714d1ad078e1a4f8e52e6ae6e7156d3fab1032380 Apr 17 17:08:03.055497 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.055454 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" event={"ID":"b183f9d4-46f5-4c01-84a9-342f742b9289","Type":"ContainerStarted","Data":"15af858e50870e9321b49d6d3783f12b5ef0cf776d3463b3d0456b1db2a8873d"} Apr 17 17:08:03.056801 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.056732 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" event={"ID":"e5ec3f84-dd05-4c76-b51e-66780aa5f059","Type":"ContainerStarted","Data":"788ac28bd49cf8abc48627d89e463c26e5b91e40b60049f3818be341bd972478"} Apr 17 17:08:03.059631 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.059559 2568 generic.go:358] "Generic (PLEG): container finished" podID="8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2" containerID="9002fe8e66b2fe1770f90a41c444b051b1ba5f4ed08141fd9efa8e18c8aef80b" exitCode=0 Apr 17 17:08:03.059753 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.059622 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" 
event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerDied","Data":"9002fe8e66b2fe1770f90a41c444b051b1ba5f4ed08141fd9efa8e18c8aef80b"} Apr 17 17:08:03.061046 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.061020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" event={"ID":"76642cd9-fee5-4031-a695-f531ec9d1dcc","Type":"ContainerStarted","Data":"239fabd89c2c4272f425e7d714d1ad078e1a4f8e52e6ae6e7156d3fab1032380"} Apr 17 17:08:03.575393 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.575353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq" Apr 17 17:08:03.575918 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.575403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 17:08:03.575918 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:03.575480 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:08:03.575918 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:03.575615 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:08:03.575918 ip-10-0-134-244 kubenswrapper[2568]: E0417 
17:08:03.575629 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found Apr 17 17:08:03.575918 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:03.575686 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:07.575668392 +0000 UTC m=+41.191801638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found Apr 17 17:08:03.576283 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:03.576074 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:03.576283 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:03.576125 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:07.576108182 +0000 UTC m=+41.192241438 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found Apr 17 17:08:03.576283 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:03.576180 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:03.576283 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:03.576210 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:07.576200661 +0000 UTC m=+41.192333905 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found Apr 17 17:08:04.069950 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:04.069905 2568 generic.go:358] "Generic (PLEG): container finished" podID="8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2" containerID="e0f482779bbc22df04f7a5e87ab4f4cb56b152b4db966e77ec4de27a7ea34776" exitCode=0 Apr 17 17:08:04.069950 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:04.069974 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerDied","Data":"e0f482779bbc22df04f7a5e87ab4f4cb56b152b4db966e77ec4de27a7ea34776"} Apr 17 17:08:05.076538 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:05.076502 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b66zf" 
event={"ID":"8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2","Type":"ContainerStarted","Data":"212f45edcfa1f1651ea3d2130c6bad4ab61fa104ca8a23c19dad82b51a019d20"} Apr 17 17:08:05.102268 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:05.102091 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b66zf" podStartSLOduration=6.317876466 podStartE2EDuration="39.102071988s" podCreationTimestamp="2026-04-17 17:07:26 +0000 UTC" firstStartedPulling="2026-04-17 17:07:29.56891507 +0000 UTC m=+3.185048315" lastFinishedPulling="2026-04-17 17:08:02.353110596 +0000 UTC m=+35.969243837" observedRunningTime="2026-04-17 17:08:05.099960388 +0000 UTC m=+38.716093679" watchObservedRunningTime="2026-04-17 17:08:05.102071988 +0000 UTC m=+38.718205251" Apr 17 17:08:07.613228 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:07.613185 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:07.613283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq" Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:07.613326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 
17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613333 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613358 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613414 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613421 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:15.613401404 +0000 UTC m=+49.229534674 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613450 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:15.613439005 +0000 UTC m=+49.229572257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613468 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:07.613691 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:07.613522 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:15.61350664 +0000 UTC m=+49.229639887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found Apr 17 17:08:09.085116 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.085078 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" event={"ID":"b183f9d4-46f5-4c01-84a9-342f742b9289","Type":"ContainerStarted","Data":"cec3db4707b4ce0b91df81a0a2d3cf392661bbfdbe87a3dbc00a5c1c637fc765"} Apr 17 17:08:09.086413 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.086386 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" event={"ID":"e5ec3f84-dd05-4c76-b51e-66780aa5f059","Type":"ContainerStarted","Data":"1f7b6eac1269dbee08774743496b538ee6bcffd892925c5b5e5bbd17a4154a36"} Apr 17 17:08:09.086588 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.086569 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" Apr 17 17:08:09.087664 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.087641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" event={"ID":"76642cd9-fee5-4031-a695-f531ec9d1dcc","Type":"ContainerStarted","Data":"3dd40248bf1850161912bc6657498cf51b0e11ab5b3412d2ea6d7061fa0d7842"} Apr 17 17:08:09.088348 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.088331 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" Apr 17 17:08:09.120894 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.120854 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" podStartSLOduration=32.11146708 podStartE2EDuration="38.120842539s" podCreationTimestamp="2026-04-17 17:07:31 +0000 UTC" firstStartedPulling="2026-04-17 17:08:02.329867163 +0000 UTC m=+35.946000407" lastFinishedPulling="2026-04-17 17:08:08.339242621 +0000 UTC m=+41.955375866" observedRunningTime="2026-04-17 17:08:09.102020703 +0000 UTC m=+42.718153968" watchObservedRunningTime="2026-04-17 17:08:09.120842539 +0000 UTC m=+42.736975779" Apr 17 17:08:09.121110 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:09.121073 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" podStartSLOduration=32.097851375 podStartE2EDuration="38.121069137s" podCreationTimestamp="2026-04-17 17:07:31 +0000 UTC" firstStartedPulling="2026-04-17 17:08:02.329721588 +0000 UTC m=+35.945854831" lastFinishedPulling="2026-04-17 17:08:08.35293935 +0000 UTC m=+41.969072593" observedRunningTime="2026-04-17 17:08:09.120224783 +0000 UTC m=+42.736358045" 
watchObservedRunningTime="2026-04-17 17:08:09.121069137 +0000 UTC m=+42.737202395" Apr 17 17:08:12.095552 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:12.095512 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" event={"ID":"76642cd9-fee5-4031-a695-f531ec9d1dcc","Type":"ContainerStarted","Data":"00318608a753b97920e62a039e04b6dca8a1e13902473f06224e9718a9617a9d"} Apr 17 17:08:12.095552 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:12.095555 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" event={"ID":"76642cd9-fee5-4031-a695-f531ec9d1dcc","Type":"ContainerStarted","Data":"2af4b4cf9c9e9060148d2128adba3e3dc290627f6fa5ee1761a274bb04e2de35"} Apr 17 17:08:12.113865 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:12.113822 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" podStartSLOduration=32.336009589 podStartE2EDuration="41.11380975s" podCreationTimestamp="2026-04-17 17:07:31 +0000 UTC" firstStartedPulling="2026-04-17 17:08:02.329720645 +0000 UTC m=+35.945853891" lastFinishedPulling="2026-04-17 17:08:11.107520811 +0000 UTC m=+44.723654052" observedRunningTime="2026-04-17 17:08:12.113360004 +0000 UTC m=+45.729493267" watchObservedRunningTime="2026-04-17 17:08:12.11380975 +0000 UTC m=+45.729943012" Apr 17 17:08:14.464373 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:14.464342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:08:14.467722 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:08:14.467702 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0-original-pull-secret\") pod \"global-pull-secret-syncer-gfbn9\" (UID: \"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0\") " pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:08:14.613318 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:14.613286 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gfbn9" Apr 17 17:08:14.720791 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:14.720729 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gfbn9"] Apr 17 17:08:14.723699 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:08:14.723665 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbb42f0_8a82_4eb2_8ab6_dca288c60dc0.slice/crio-f5d2ad2373be8e5fac8ec0791f3532608fc9b9b565e106b2c86e36c9e99aef8d WatchSource:0}: Error finding container f5d2ad2373be8e5fac8ec0791f3532608fc9b9b565e106b2c86e36c9e99aef8d: Status 404 returned error can't find the container with id f5d2ad2373be8e5fac8ec0791f3532608fc9b9b565e106b2c86e36c9e99aef8d Apr 17 17:08:15.102402 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:15.102371 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gfbn9" event={"ID":"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0","Type":"ContainerStarted","Data":"f5d2ad2373be8e5fac8ec0791f3532608fc9b9b565e106b2c86e36c9e99aef8d"} Apr 17 17:08:15.673668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:15.673630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " 
pod="openshift-ingress-canary/ingress-canary-whlxq" Apr 17 17:08:15.673668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:15.673673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.673780 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.673794 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.673854 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:31.673827562 +0000 UTC m=+65.289960820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.673888 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:31.673878083 +0000 UTC m=+65.290011341 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:15.673942 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.674039 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.674055 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found Apr 17 17:08:15.674172 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:15.674103 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:31.67408626 +0000 UTC m=+65.290219510 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found Apr 17 17:08:20.113295 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:20.113259 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gfbn9" event={"ID":"fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0","Type":"ContainerStarted","Data":"6c402f0693c9cc5369dd478c734895138b104261ad56a8b07392646d2faef2eb"} Apr 17 17:08:20.127562 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:20.127513 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gfbn9" podStartSLOduration=33.819765681 podStartE2EDuration="38.127499883s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:08:14.726010871 +0000 UTC m=+48.342144112" lastFinishedPulling="2026-04-17 17:08:19.03374507 +0000 UTC m=+52.649878314" observedRunningTime="2026-04-17 17:08:20.127413166 +0000 UTC m=+53.743546429" watchObservedRunningTime="2026-04-17 17:08:20.127499883 +0000 UTC m=+53.743633145" Apr 17 17:08:26.048117 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:26.048085 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-97rxf" Apr 17 17:08:31.695904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:31.695867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:31.695921 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq" Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:31.695939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st" Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696007 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696026 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696079 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:03.696063985 +0000 UTC m=+97.312197225 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found
Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696080 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696134 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696134 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:09:03.696118522 +0000 UTC m=+97.312251765 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found
Apr 17 17:08:31.696279 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:31.696177 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:03.696167814 +0000 UTC m=+97.312301058 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found
Apr 17 17:08:32.602790 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.602753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:08:32.602790 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.602789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9"
Apr 17 17:08:32.602997 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:32.602897 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:08:32.602997 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:08:32.602961 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:36.602943934 +0000 UTC m=+130.219077195 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : secret "metrics-daemon-secret" not found
Apr 17 17:08:32.605490 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.605472 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:08:32.616383 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.616366 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:08:32.626851 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.626826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjc2\" (UniqueName: \"kubernetes.io/projected/8e113e09-f336-4b95-a2e1-db3f043106af-kube-api-access-ntjc2\") pod \"network-check-target-z2wnp\" (UID: \"8e113e09-f336-4b95-a2e1-db3f043106af\") " pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:08:32.715085 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.715057 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s65fp\""
Apr 17 17:08:32.722067 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.722052 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:08:32.840120 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:32.840048 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z2wnp"]
Apr 17 17:08:32.842694 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:08:32.842664 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e113e09_f336_4b95_a2e1_db3f043106af.slice/crio-21a872891281c2926366ff643262534a1d3130805190256bd4d456881a047d31 WatchSource:0}: Error finding container 21a872891281c2926366ff643262534a1d3130805190256bd4d456881a047d31: Status 404 returned error can't find the container with id 21a872891281c2926366ff643262534a1d3130805190256bd4d456881a047d31
Apr 17 17:08:33.146244 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:33.146210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z2wnp" event={"ID":"8e113e09-f336-4b95-a2e1-db3f043106af","Type":"ContainerStarted","Data":"21a872891281c2926366ff643262534a1d3130805190256bd4d456881a047d31"}
Apr 17 17:08:36.157254 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:36.157225 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z2wnp" event={"ID":"8e113e09-f336-4b95-a2e1-db3f043106af","Type":"ContainerStarted","Data":"4bef3586649eed91ec498bd21e14e5d8b302f8c27a90a68a3283e1e2e321b7f6"}
Apr 17 17:08:36.157680 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:36.157345 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:08:36.176042 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:08:36.176001 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z2wnp" podStartSLOduration=66.050379632 podStartE2EDuration="1m9.175988243s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:08:32.845176753 +0000 UTC m=+66.461309994" lastFinishedPulling="2026-04-17 17:08:35.970785363 +0000 UTC m=+69.586918605" observedRunningTime="2026-04-17 17:08:36.175363203 +0000 UTC m=+69.791496467" watchObservedRunningTime="2026-04-17 17:08:36.175988243 +0000 UTC m=+69.792121507"
Apr 17 17:09:03.727405 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:03.727276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:09:03.727405 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:03.727381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:03.727411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727439 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727455 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f797f5f96-zsrx2: secret "image-registry-tls" not found
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727514 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls podName:138cec68-511a-467f-b5de-b2b7fc19fbb5 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:07.72749569 +0000 UTC m=+161.343628932 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls") pod "image-registry-5f797f5f96-zsrx2" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5") : secret "image-registry-tls" not found
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727530 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727545 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727579 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls podName:d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:07.727568305 +0000 UTC m=+161.343701551 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls") pod "dns-default-6b8st" (UID: "d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897") : secret "dns-default-metrics-tls" not found
Apr 17 17:09:03.727939 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:03.727614 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert podName:da1e7104-4977-42cd-80d8-b8775e3a717b nodeName:}" failed. No retries permitted until 2026-04-17 17:10:07.727597356 +0000 UTC m=+161.343730612 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert") pod "ingress-canary-whlxq" (UID: "da1e7104-4977-42cd-80d8-b8775e3a717b") : secret "canary-serving-cert" not found
Apr 17 17:09:07.161682 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:07.161653 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z2wnp"
Apr 17 17:09:31.384706 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:31.384678 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4vjjz_fa447df9-716a-47c2-9ffb-b819a566f787/dns-node-resolver/0.log"
Apr 17 17:09:32.385194 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:32.385166 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j5pdx_3868d9bd-d06e-4b37-89c0-ab0c05df3fff/node-ca/0.log"
Apr 17 17:09:36.671043 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:09:36.670997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9"
Apr 17 17:09:36.671469 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:36.671134 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:09:36.671469 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:09:36.671202 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs podName:62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89 nodeName:}" failed. No retries permitted until 2026-04-17 17:11:38.671183515 +0000 UTC m=+252.287316756 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs") pod "network-metrics-daemon-4xcb9" (UID: "62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89") : secret "metrics-daemon-secret" not found
Apr 17 17:10:02.839589 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:02.839550 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5"
Apr 17 17:10:02.868985 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:02.868950 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-whlxq" podUID="da1e7104-4977-42cd-80d8-b8775e3a717b"
Apr 17 17:10:02.876082 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:02.876058 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6b8st" podUID="d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897"
Apr 17 17:10:02.921183 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:02.921144 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4xcb9" podUID="62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89"
Apr 17 17:10:03.352335 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:03.352292 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:10:03.352335 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:03.352330 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:10:03.352571 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:03.352397 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6b8st"
Apr 17 17:10:06.455780 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.455743 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qb4fz"]
Apr 17 17:10:06.458790 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.458771 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.463713 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.463692 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:10:06.464391 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.464371 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hg6xj\""
Apr 17 17:10:06.464619 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.464602 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 17:10:06.465087 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.465071 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 17:10:06.465708 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.465690 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:10:06.472687 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.472669 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qb4fz"]
Apr 17 17:10:06.480658 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.480637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3ae13ab2-ef52-4063-a1b8-20d01ad15774-crio-socket\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.480757 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.480693 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3ae13ab2-ef52-4063-a1b8-20d01ad15774-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.480757 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.480717 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3ae13ab2-ef52-4063-a1b8-20d01ad15774-data-volume\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.480858 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.480781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3ae13ab2-ef52-4063-a1b8-20d01ad15774-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.480912 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.480890 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ddvj\" (UniqueName: \"kubernetes.io/projected/3ae13ab2-ef52-4063-a1b8-20d01ad15774-kube-api-access-5ddvj\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.581731 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.581703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3ae13ab2-ef52-4063-a1b8-20d01ad15774-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.581883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.581740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3ae13ab2-ef52-4063-a1b8-20d01ad15774-data-volume\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.581883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.581765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3ae13ab2-ef52-4063-a1b8-20d01ad15774-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.581883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.581805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ddvj\" (UniqueName: \"kubernetes.io/projected/3ae13ab2-ef52-4063-a1b8-20d01ad15774-kube-api-access-5ddvj\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.581883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.581826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3ae13ab2-ef52-4063-a1b8-20d01ad15774-crio-socket\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.582098 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.581907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3ae13ab2-ef52-4063-a1b8-20d01ad15774-crio-socket\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.582150 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.582104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3ae13ab2-ef52-4063-a1b8-20d01ad15774-data-volume\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.582339 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.582299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3ae13ab2-ef52-4063-a1b8-20d01ad15774-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.583937 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.583921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3ae13ab2-ef52-4063-a1b8-20d01ad15774-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.593562 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.593538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ddvj\" (UniqueName: \"kubernetes.io/projected/3ae13ab2-ef52-4063-a1b8-20d01ad15774-kube-api-access-5ddvj\") pod \"insights-runtime-extractor-qb4fz\" (UID: \"3ae13ab2-ef52-4063-a1b8-20d01ad15774\") " pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.767477 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.767447 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qb4fz"
Apr 17 17:10:06.879016 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:06.878975 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qb4fz"]
Apr 17 17:10:06.882656 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:10:06.882630 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae13ab2_ef52_4063_a1b8_20d01ad15774.slice/crio-9e6385faba104df68c9869e3ddbbbcd2aa8747774080543adbe4e47e80aab866 WatchSource:0}: Error finding container 9e6385faba104df68c9869e3ddbbbcd2aa8747774080543adbe4e47e80aab866: Status 404 returned error can't find the container with id 9e6385faba104df68c9869e3ddbbbcd2aa8747774080543adbe4e47e80aab866
Apr 17 17:10:07.362932 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.362899 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qb4fz" event={"ID":"3ae13ab2-ef52-4063-a1b8-20d01ad15774","Type":"ContainerStarted","Data":"34cd0ee9ab87b13697f44185b007a0daad540097e9c5648e14e996c7253808a6"}
Apr 17 17:10:07.362932 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.362934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qb4fz" event={"ID":"3ae13ab2-ef52-4063-a1b8-20d01ad15774","Type":"ContainerStarted","Data":"9e6385faba104df68c9869e3ddbbbcd2aa8747774080543adbe4e47e80aab866"}
Apr 17 17:10:07.789873 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.789842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:10:07.790270 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.789898 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:10:07.790270 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.789923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:10:07.792144 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.792124 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da1e7104-4977-42cd-80d8-b8775e3a717b-cert\") pod \"ingress-canary-whlxq\" (UID: \"da1e7104-4977-42cd-80d8-b8775e3a717b\") " pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:10:07.792228 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.792150 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"image-registry-5f797f5f96-zsrx2\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:10:07.792552 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.792532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897-metrics-tls\") pod \"dns-default-6b8st\" (UID: \"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897\") " pod="openshift-dns/dns-default-6b8st"
Apr 17 17:10:07.856892 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.856841 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qdqk7\""
Apr 17 17:10:07.856892 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.856841 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b4xpj\""
Apr 17 17:10:07.857044 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.856868 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j45n5\""
Apr 17 17:10:07.864021 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.863974 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:10:07.864021 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.864003 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6b8st"
Apr 17 17:10:07.864265 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:07.863978 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-whlxq"
Apr 17 17:10:08.021534 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.021500 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f797f5f96-zsrx2"]
Apr 17 17:10:08.024424 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:10:08.024400 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod138cec68_511a_467f_b5de_b2b7fc19fbb5.slice/crio-f2bcc94e872babd69c02f440bf68b4650d5e9b096e421607415b3a74ee389c2e WatchSource:0}: Error finding container f2bcc94e872babd69c02f440bf68b4650d5e9b096e421607415b3a74ee389c2e: Status 404 returned error can't find the container with id f2bcc94e872babd69c02f440bf68b4650d5e9b096e421607415b3a74ee389c2e
Apr 17 17:10:08.230321 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.230222 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-whlxq"]
Apr 17 17:10:08.233176 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:10:08.233145 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda1e7104_4977_42cd_80d8_b8775e3a717b.slice/crio-5ac492c5937438fdbe7b98b0401d1abffcfabc9f82ddbd3721bf9e0b05b2efe3 WatchSource:0}: Error finding container 5ac492c5937438fdbe7b98b0401d1abffcfabc9f82ddbd3721bf9e0b05b2efe3: Status 404 returned error can't find the container with id 5ac492c5937438fdbe7b98b0401d1abffcfabc9f82ddbd3721bf9e0b05b2efe3
Apr 17 17:10:08.237486 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.237358 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6b8st"]
Apr 17 17:10:08.240254 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:10:08.240230 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd86d134d_5cb6_4c1b_8ec9_0bc4aa4b3897.slice/crio-e84b989d8efdc6a20ec59d9850d73a2055bd6f39e16bb2ce8a9df75432d688b4 WatchSource:0}: Error finding container e84b989d8efdc6a20ec59d9850d73a2055bd6f39e16bb2ce8a9df75432d688b4: Status 404 returned error can't find the container with id e84b989d8efdc6a20ec59d9850d73a2055bd6f39e16bb2ce8a9df75432d688b4
Apr 17 17:10:08.367606 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.367505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" event={"ID":"138cec68-511a-467f-b5de-b2b7fc19fbb5","Type":"ContainerStarted","Data":"9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0"}
Apr 17 17:10:08.367606 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.367549 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" event={"ID":"138cec68-511a-467f-b5de-b2b7fc19fbb5","Type":"ContainerStarted","Data":"f2bcc94e872babd69c02f440bf68b4650d5e9b096e421607415b3a74ee389c2e"}
Apr 17 17:10:08.367856 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.367640 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:10:08.368676 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.368652 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6b8st" event={"ID":"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897","Type":"ContainerStarted","Data":"e84b989d8efdc6a20ec59d9850d73a2055bd6f39e16bb2ce8a9df75432d688b4"}
Apr 17 17:10:08.370384 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.370331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qb4fz" event={"ID":"3ae13ab2-ef52-4063-a1b8-20d01ad15774","Type":"ContainerStarted","Data":"c32d0ae1d16b9f44d638a21a3aa3f3aa59ec61a6c2464ddf7d30faaafd6802d8"}
Apr 17 17:10:08.371353 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.371332 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-whlxq" event={"ID":"da1e7104-4977-42cd-80d8-b8775e3a717b","Type":"ContainerStarted","Data":"5ac492c5937438fdbe7b98b0401d1abffcfabc9f82ddbd3721bf9e0b05b2efe3"}
Apr 17 17:10:08.387381 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:08.387333 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" podStartSLOduration=161.387299187 podStartE2EDuration="2m41.387299187s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:10:08.385872487 +0000 UTC m=+162.002005779" watchObservedRunningTime="2026-04-17 17:10:08.387299187 +0000 UTC m=+162.003432454"
Apr 17 17:10:09.087758 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.087689 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" podUID="e5ec3f84-dd05-4c76-b51e-66780aa5f059" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused"
Apr 17 17:10:09.376818 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.376728 2568 generic.go:358] "Generic (PLEG): container finished" podID="e5ec3f84-dd05-4c76-b51e-66780aa5f059" containerID="1f7b6eac1269dbee08774743496b538ee6bcffd892925c5b5e5bbd17a4154a36" exitCode=1
Apr 17 17:10:09.376987 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.376809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" event={"ID":"e5ec3f84-dd05-4c76-b51e-66780aa5f059","Type":"ContainerDied","Data":"1f7b6eac1269dbee08774743496b538ee6bcffd892925c5b5e5bbd17a4154a36"}
Apr 17 17:10:09.377253 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.377218 2568 scope.go:117] "RemoveContainer" containerID="1f7b6eac1269dbee08774743496b538ee6bcffd892925c5b5e5bbd17a4154a36"
Apr 17 17:10:09.378434 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.378407 2568 generic.go:358] "Generic (PLEG): container finished" podID="b183f9d4-46f5-4c01-84a9-342f742b9289" containerID="cec3db4707b4ce0b91df81a0a2d3cf392661bbfdbe87a3dbc00a5c1c637fc765" exitCode=255
Apr 17 17:10:09.378563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.378488 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" event={"ID":"b183f9d4-46f5-4c01-84a9-342f742b9289","Type":"ContainerDied","Data":"cec3db4707b4ce0b91df81a0a2d3cf392661bbfdbe87a3dbc00a5c1c637fc765"}
Apr 17 17:10:09.378896 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:09.378879 2568 scope.go:117] "RemoveContainer" containerID="cec3db4707b4ce0b91df81a0a2d3cf392661bbfdbe87a3dbc00a5c1c637fc765"
Apr 17 17:10:10.130756 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.130733 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n"
Apr 17 17:10:10.148113 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.148091 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:10:10.393928 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.393852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qb4fz" event={"ID":"3ae13ab2-ef52-4063-a1b8-20d01ad15774","Type":"ContainerStarted","Data":"6abc4881018e4e2f8df13eff26b0081a735eaf1c791ac0d635f507c741f0aef5"}
Apr 17 17:10:10.395488 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.395445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-whlxq" event={"ID":"da1e7104-4977-42cd-80d8-b8775e3a717b","Type":"ContainerStarted","Data":"c20f6158e76884e2f115cfb0edf21f4e7498fb2539010612e25d0618dd52cd96"}
Apr 17 17:10:10.397039 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.396996 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-595ccf57f7-xzv8n" event={"ID":"b183f9d4-46f5-4c01-84a9-342f742b9289","Type":"ContainerStarted","Data":"2ec90e5e3499e0da63e6acf942d1094a4a7743e4b5dd2269a96f2724e0e1b0ae"}
Apr 17 17:10:10.399105 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.399076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6b8st" event={"ID":"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897","Type":"ContainerStarted","Data":"771d383668171a8d4eb9e720eca41e07b283ff4a73c27fbf87f1256ef8d35dc8"}
Apr 17 17:10:10.402973 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.402200 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx" event={"ID":"e5ec3f84-dd05-4c76-b51e-66780aa5f059","Type":"ContainerStarted","Data":"c2ef72546ee5aab067671edf3997c4b42b7795abbd56ad0fac245c5ec78206b0"}
Apr 17 17:10:10.402973 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.402535 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:10:10.403132 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.403113 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cfddc449d-9htbx"
Apr 17 17:10:10.410571 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.410532 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qb4fz" podStartSLOduration=1.190411115 podStartE2EDuration="4.410519777s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 17:10:06.939883347 +0000 UTC m=+160.556016589" lastFinishedPulling="2026-04-17 17:10:10.159991996 +0000 UTC m=+163.776125251" observedRunningTime="2026-04-17 17:10:10.410159783 +0000 UTC m=+164.026293062" watchObservedRunningTime="2026-04-17 17:10:10.410519777 +0000 UTC m=+164.026653041"
Apr 17 17:10:10.458085 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:10.458045 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-whlxq" podStartSLOduration=129.528159457 podStartE2EDuration="2m11.458031268s" podCreationTimestamp="2026-04-17 17:07:59 +0000 UTC" firstStartedPulling="2026-04-17 17:10:08.235427559 +0000 UTC m=+161.851560813" lastFinishedPulling="2026-04-17 17:10:10.16529937 +0000 UTC m=+163.781432624" observedRunningTime="2026-04-17 17:10:10.457110676 +0000 UTC m=+164.073243939" watchObservedRunningTime="2026-04-17 17:10:10.458031268 +0000 UTC m=+164.074164531"
Apr 17 17:10:11.405864 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:11.405824 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6b8st" event={"ID":"d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897","Type":"ContainerStarted","Data":"1311495d5bfca4e13578fb552b3a384b6c5e81c6d256c243fd791d593d03f7db"}
Apr 17 17:10:11.406290 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:11.406290 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:11.405998 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6b8st"
Apr 17 17:10:11.425292 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:11.425250
2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6b8st" podStartSLOduration=130.500200748 podStartE2EDuration="2m12.425238921s" podCreationTimestamp="2026-04-17 17:07:59 +0000 UTC" firstStartedPulling="2026-04-17 17:10:08.242807486 +0000 UTC m=+161.858940730" lastFinishedPulling="2026-04-17 17:10:10.167845658 +0000 UTC m=+163.783978903" observedRunningTime="2026-04-17 17:10:11.425067375 +0000 UTC m=+165.041200638" watchObservedRunningTime="2026-04-17 17:10:11.425238921 +0000 UTC m=+165.041372182" Apr 17 17:10:15.376202 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.376170 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qppqk"] Apr 17 17:10:15.383400 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.383370 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.386806 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.386783 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:10:15.386939 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.386867 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:10:15.386939 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.386896 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wcb6d\"" Apr 17 17:10:15.386939 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.386896 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:10:15.388087 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.388069 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:10:15.388217 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.388091 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:10:15.388217 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.388100 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:10:15.448469 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-sys\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448599 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448475 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-textfile\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448599 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjn7\" (UniqueName: \"kubernetes.io/projected/02d2dd0a-d826-404c-9b25-1599d2485324-kube-api-access-dkjn7\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448599 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448509 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-tls\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448599 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448562 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-root\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448599 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-wtmp\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448647 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02d2dd0a-d826-404c-9b25-1599d2485324-metrics-client-ca\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448811 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.448673 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.448811 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:10:15.448692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-accelerators-collector-config\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549312 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549271 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-textfile\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549429 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkjn7\" (UniqueName: \"kubernetes.io/projected/02d2dd0a-d826-404c-9b25-1599d2485324-kube-api-access-dkjn7\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549429 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549343 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-tls\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549429 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-root\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " 
pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549429 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-wtmp\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549429 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02d2dd0a-d826-404c-9b25-1599d2485324-metrics-client-ca\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549472 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-root\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-accelerators-collector-config\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-sys\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-wtmp\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549668 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:15.549632 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:10:15.549935 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549634 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02d2dd0a-d826-404c-9b25-1599d2485324-sys\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.549935 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:15.549715 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-tls podName:02d2dd0a-d826-404c-9b25-1599d2485324 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:16.049683279 +0000 UTC m=+169.665816525 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-tls") pod "node-exporter-qppqk" (UID: "02d2dd0a-d826-404c-9b25-1599d2485324") : secret "node-exporter-tls" not found Apr 17 17:10:15.549935 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.549794 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-textfile\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.550075 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.550012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02d2dd0a-d826-404c-9b25-1599d2485324-metrics-client-ca\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.550075 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.550021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-accelerators-collector-config\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.551593 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.551574 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.560736 ip-10-0-134-244 
kubenswrapper[2568]: I0417 17:10:15.560710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkjn7\" (UniqueName: \"kubernetes.io/projected/02d2dd0a-d826-404c-9b25-1599d2485324-kube-api-access-dkjn7\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:15.901246 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:15.901215 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9" Apr 17 17:10:16.052556 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:16.052525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-tls\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:16.054635 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:16.054615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02d2dd0a-d826-404c-9b25-1599d2485324-node-exporter-tls\") pod \"node-exporter-qppqk\" (UID: \"02d2dd0a-d826-404c-9b25-1599d2485324\") " pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:16.292795 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:16.292766 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qppqk" Apr 17 17:10:16.300414 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:10:16.300388 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d2dd0a_d826_404c_9b25_1599d2485324.slice/crio-db85c69a2a305198b8d6b656f0b055d77a1a32e0c417e5f3f1e28997cb5b0364 WatchSource:0}: Error finding container db85c69a2a305198b8d6b656f0b055d77a1a32e0c417e5f3f1e28997cb5b0364: Status 404 returned error can't find the container with id db85c69a2a305198b8d6b656f0b055d77a1a32e0c417e5f3f1e28997cb5b0364 Apr 17 17:10:16.418284 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:16.418214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qppqk" event={"ID":"02d2dd0a-d826-404c-9b25-1599d2485324","Type":"ContainerStarted","Data":"db85c69a2a305198b8d6b656f0b055d77a1a32e0c417e5f3f1e28997cb5b0364"} Apr 17 17:10:17.422214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:17.422137 2568 generic.go:358] "Generic (PLEG): container finished" podID="02d2dd0a-d826-404c-9b25-1599d2485324" containerID="0da14f19b60a51159b4c6fae0608ff6d5c1aa7306c70d5c5cbd11b7d30eba67a" exitCode=0 Apr 17 17:10:17.422214 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:17.422200 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qppqk" event={"ID":"02d2dd0a-d826-404c-9b25-1599d2485324","Type":"ContainerDied","Data":"0da14f19b60a51159b4c6fae0608ff6d5c1aa7306c70d5c5cbd11b7d30eba67a"} Apr 17 17:10:18.426539 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:18.426505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qppqk" event={"ID":"02d2dd0a-d826-404c-9b25-1599d2485324","Type":"ContainerStarted","Data":"ee5e69b2a285597170931a82c9830ba44b3f37da0681f34e6aefe0d1bc834c69"} Apr 17 17:10:18.426539 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:18.426539 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qppqk" event={"ID":"02d2dd0a-d826-404c-9b25-1599d2485324","Type":"ContainerStarted","Data":"dff2298f667452e6f8bf1bbfc3dbb4076a5d20097d966ba0de7f333438a9cb86"} Apr 17 17:10:18.445514 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:18.445473 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qppqk" podStartSLOduration=2.585675487 podStartE2EDuration="3.445459957s" podCreationTimestamp="2026-04-17 17:10:15 +0000 UTC" firstStartedPulling="2026-04-17 17:10:16.301860563 +0000 UTC m=+169.917993805" lastFinishedPulling="2026-04-17 17:10:17.161645034 +0000 UTC m=+170.777778275" observedRunningTime="2026-04-17 17:10:18.443633228 +0000 UTC m=+172.059766485" watchObservedRunningTime="2026-04-17 17:10:18.445459957 +0000 UTC m=+172.061593220" Apr 17 17:10:21.409747 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:21.409709 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6b8st" Apr 17 17:10:27.868345 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:27.868300 2568 patch_prober.go:28] interesting pod/image-registry-5f797f5f96-zsrx2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:10:27.868718 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:27.868365 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:10:28.481161 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:28.481131 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-5f797f5f96-zsrx2"] Apr 17 17:10:28.485067 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:28.485040 2568 patch_prober.go:28] interesting pod/image-registry-5f797f5f96-zsrx2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:10:28.485182 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:28.485080 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:10:38.485284 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:38.485255 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:10:39.996221 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:39.996189 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6b8st_d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897/dns/0.log" Apr 17 17:10:40.195582 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:40.195553 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6b8st_d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897/kube-rbac-proxy/0.log" Apr 17 17:10:41.195450 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:41.195422 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4vjjz_fa447df9-716a-47c2-9ffb-b819a566f787/dns-node-resolver/0.log" Apr 17 17:10:42.396525 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:42.396497 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-whlxq_da1e7104-4977-42cd-80d8-b8775e3a717b/serve-healthcheck-canary/0.log" Apr 17 17:10:50.156690 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:50.156650 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" podUID="76642cd9-fee5-4031-a695-f531ec9d1dcc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 17:10:53.499766 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.499706 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerName="registry" containerID="cri-o://9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0" gracePeriod=30 Apr 17 17:10:53.736273 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.736251 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" Apr 17 17:10:53.842522 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842490 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-bound-sa-token\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.842522 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842525 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk7dk\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-kube-api-access-hk7dk\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.842767 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842548 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-certificates\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.842767 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842569 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-trusted-ca\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.842767 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842612 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.842767 
ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842698 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-image-registry-private-configuration\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.842767 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842765 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-installation-pull-secrets\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.843016 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842815 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/138cec68-511a-467f-b5de-b2b7fc19fbb5-ca-trust-extracted\") pod \"138cec68-511a-467f-b5de-b2b7fc19fbb5\" (UID: \"138cec68-511a-467f-b5de-b2b7fc19fbb5\") " Apr 17 17:10:53.843016 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.842989 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:10:53.843126 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.843062 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:10:53.845475 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.845446 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:10:53.845475 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.845449 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:10:53.845636 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.845487 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-kube-api-access-hk7dk" (OuterVolumeSpecName: "kube-api-access-hk7dk") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "kube-api-access-hk7dk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:10:53.845636 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.845510 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:10:53.845636 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.845614 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:10:53.851858 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.851832 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138cec68-511a-467f-b5de-b2b7fc19fbb5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "138cec68-511a-467f-b5de-b2b7fc19fbb5" (UID: "138cec68-511a-467f-b5de-b2b7fc19fbb5"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:10:53.944377 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944345 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-tls\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944377 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944376 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-image-registry-private-configuration\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944386 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/138cec68-511a-467f-b5de-b2b7fc19fbb5-installation-pull-secrets\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944396 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/138cec68-511a-467f-b5de-b2b7fc19fbb5-ca-trust-extracted\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944405 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-bound-sa-token\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944414 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hk7dk\" (UniqueName: \"kubernetes.io/projected/138cec68-511a-467f-b5de-b2b7fc19fbb5-kube-api-access-hk7dk\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944422 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-registry-certificates\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:53.944528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:53.944430 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138cec68-511a-467f-b5de-b2b7fc19fbb5-trusted-ca\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:10:54.515636 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.515602 2568 generic.go:358] "Generic (PLEG): container finished" podID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerID="9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0" exitCode=0
Apr 17 17:10:54.516017 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.515678 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2"
Apr 17 17:10:54.516017 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.515694 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" event={"ID":"138cec68-511a-467f-b5de-b2b7fc19fbb5","Type":"ContainerDied","Data":"9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0"}
Apr 17 17:10:54.516017 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.515740 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f797f5f96-zsrx2" event={"ID":"138cec68-511a-467f-b5de-b2b7fc19fbb5","Type":"ContainerDied","Data":"f2bcc94e872babd69c02f440bf68b4650d5e9b096e421607415b3a74ee389c2e"}
Apr 17 17:10:54.516017 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.515757 2568 scope.go:117] "RemoveContainer" containerID="9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0"
Apr 17 17:10:54.523795 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.523781 2568 scope.go:117] "RemoveContainer" containerID="9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0"
Apr 17 17:10:54.524014 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:10:54.523995 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0\": container with ID starting with 9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0 not found: ID does not exist" containerID="9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0"
Apr 17 17:10:54.524063 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.524021 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0"} err="failed to get container status \"9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0\": rpc error: code = NotFound desc = could not find container \"9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0\": container with ID starting with 9085f93c8575f4461dc1b461b4bd4704acd83f37dcb804c390afe0161d7752e0 not found: ID does not exist"
Apr 17 17:10:54.536199 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.536178 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f797f5f96-zsrx2"]
Apr 17 17:10:54.539626 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.539601 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f797f5f96-zsrx2"]
Apr 17 17:10:54.904290 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:10:54.904259 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" path="/var/lib/kubelet/pods/138cec68-511a-467f-b5de-b2b7fc19fbb5/volumes"
Apr 17 17:11:00.155677 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:00.155640 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" podUID="76642cd9-fee5-4031-a695-f531ec9d1dcc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 17:11:10.156541 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.156502 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" podUID="76642cd9-fee5-4031-a695-f531ec9d1dcc" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 17:11:10.157018 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.156584 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd"
Apr 17 17:11:10.157060 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.157028 2568 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"00318608a753b97920e62a039e04b6dca8a1e13902473f06224e9718a9617a9d"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 17 17:11:10.157094 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.157061 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" podUID="76642cd9-fee5-4031-a695-f531ec9d1dcc" containerName="service-proxy" containerID="cri-o://00318608a753b97920e62a039e04b6dca8a1e13902473f06224e9718a9617a9d" gracePeriod=30
Apr 17 17:11:10.557094 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.557065 2568 generic.go:358] "Generic (PLEG): container finished" podID="76642cd9-fee5-4031-a695-f531ec9d1dcc" containerID="00318608a753b97920e62a039e04b6dca8a1e13902473f06224e9718a9617a9d" exitCode=2
Apr 17 17:11:10.557254 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.557142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" event={"ID":"76642cd9-fee5-4031-a695-f531ec9d1dcc","Type":"ContainerDied","Data":"00318608a753b97920e62a039e04b6dca8a1e13902473f06224e9718a9617a9d"}
Apr 17 17:11:10.557254 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:10.557176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bdd67b75d-q49dd" event={"ID":"76642cd9-fee5-4031-a695-f531ec9d1dcc","Type":"ContainerStarted","Data":"1d1c01e36e02602a8d5289a4b787f6bf7fcb500433c6dac4bbf25acf454de27c"}
Apr 17 17:11:38.769953 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:38.769866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9"
Apr 17 17:11:38.772110 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:38.772078 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89-metrics-certs\") pod \"network-metrics-daemon-4xcb9\" (UID: \"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89\") " pod="openshift-multus/network-metrics-daemon-4xcb9"
Apr 17 17:11:39.004883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:39.004855 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mrqrv\""
Apr 17 17:11:39.012968 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:39.012946 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4xcb9"
Apr 17 17:11:39.127575 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:39.127544 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4xcb9"]
Apr 17 17:11:39.130418 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:11:39.130392 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e98ec0_c09a_4b6f_8e6e_4f9c9896ab89.slice/crio-412d5be064ac4a7fd543010731b77a1ff1accd83b6a8196d008132cfdc67224d WatchSource:0}: Error finding container 412d5be064ac4a7fd543010731b77a1ff1accd83b6a8196d008132cfdc67224d: Status 404 returned error can't find the container with id 412d5be064ac4a7fd543010731b77a1ff1accd83b6a8196d008132cfdc67224d
Apr 17 17:11:39.625872 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:39.625840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xcb9" event={"ID":"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89","Type":"ContainerStarted","Data":"412d5be064ac4a7fd543010731b77a1ff1accd83b6a8196d008132cfdc67224d"}
Apr 17 17:11:40.629522 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:40.629444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xcb9" event={"ID":"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89","Type":"ContainerStarted","Data":"9d11d9a62059a2d2476b6557b0015ab2c16846b0e3099b38e3e728dfe91cc17a"}
Apr 17 17:11:40.629522 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:40.629481 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4xcb9" event={"ID":"62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89","Type":"ContainerStarted","Data":"9b89ed06f71210c541cc3c00da20ea66bd458486874eddfe7bf424d7e3db82ed"}
Apr 17 17:11:40.645566 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:11:40.645516 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4xcb9" podStartSLOduration=252.502186848 podStartE2EDuration="4m13.645498966s" podCreationTimestamp="2026-04-17 17:07:27 +0000 UTC" firstStartedPulling="2026-04-17 17:11:39.132139583 +0000 UTC m=+252.748272823" lastFinishedPulling="2026-04-17 17:11:40.275451697 +0000 UTC m=+253.891584941" observedRunningTime="2026-04-17 17:11:40.645146189 +0000 UTC m=+254.261279453" watchObservedRunningTime="2026-04-17 17:11:40.645498966 +0000 UTC m=+254.261632230"
Apr 17 17:12:26.798179 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:12:26.798148 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log"
Apr 17 17:12:26.798796 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:12:26.798588 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log"
Apr 17 17:12:26.801088 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:12:26.801069 2568 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 17:14:20.386614 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.386575 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fqf5h"]
Apr 17 17:14:20.387047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.386814 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerName="registry"
Apr 17 17:14:20.387047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.386825 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerName="registry"
Apr 17 17:14:20.387047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.386871 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="138cec68-511a-467f-b5de-b2b7fc19fbb5" containerName="registry"
Apr 17 17:14:20.388528 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.388512 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.391175 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.391157 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 17:14:20.392325 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.392289 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bjx27\""
Apr 17 17:14:20.392442 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.392339 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 17:14:20.397241 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.397219 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fqf5h"]
Apr 17 17:14:20.401669 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.401651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db210abc-2b72-473d-970b-bee83c698deb-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fqf5h\" (UID: \"db210abc-2b72-473d-970b-bee83c698deb\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.401759 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.401718 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlnh9\" (UniqueName: \"kubernetes.io/projected/db210abc-2b72-473d-970b-bee83c698deb-kube-api-access-mlnh9\") pod \"cert-manager-webhook-597b96b99b-fqf5h\" (UID: \"db210abc-2b72-473d-970b-bee83c698deb\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.502991 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.502957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db210abc-2b72-473d-970b-bee83c698deb-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fqf5h\" (UID: \"db210abc-2b72-473d-970b-bee83c698deb\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.503159 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.502999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnh9\" (UniqueName: \"kubernetes.io/projected/db210abc-2b72-473d-970b-bee83c698deb-kube-api-access-mlnh9\") pod \"cert-manager-webhook-597b96b99b-fqf5h\" (UID: \"db210abc-2b72-473d-970b-bee83c698deb\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.510956 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.510931 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db210abc-2b72-473d-970b-bee83c698deb-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fqf5h\" (UID: \"db210abc-2b72-473d-970b-bee83c698deb\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.511068 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.511033 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnh9\" (UniqueName: \"kubernetes.io/projected/db210abc-2b72-473d-970b-bee83c698deb-kube-api-access-mlnh9\") pod \"cert-manager-webhook-597b96b99b-fqf5h\" (UID: \"db210abc-2b72-473d-970b-bee83c698deb\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.697488 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.697399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:20.809193 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.809161 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fqf5h"]
Apr 17 17:14:20.812240 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:14:20.812202 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb210abc_2b72_473d_970b_bee83c698deb.slice/crio-ae08781a2fe1151b28755cd9bf313f58b5f68bc87c411ab0d2bd948595084676 WatchSource:0}: Error finding container ae08781a2fe1151b28755cd9bf313f58b5f68bc87c411ab0d2bd948595084676: Status 404 returned error can't find the container with id ae08781a2fe1151b28755cd9bf313f58b5f68bc87c411ab0d2bd948595084676
Apr 17 17:14:20.814121 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:20.814105 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:14:21.020074 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:21.020040 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h" event={"ID":"db210abc-2b72-473d-970b-bee83c698deb","Type":"ContainerStarted","Data":"ae08781a2fe1151b28755cd9bf313f58b5f68bc87c411ab0d2bd948595084676"}
Apr 17 17:14:25.031171 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:25.031132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h" event={"ID":"db210abc-2b72-473d-970b-bee83c698deb","Type":"ContainerStarted","Data":"97371d8d8f0fc2c3c4cb3be66eba019c1f6fd8433e82708284b36a799c8f1429"}
Apr 17 17:14:25.031573 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:25.031285 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:14:25.047951 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:25.047892 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h" podStartSLOduration=1.681109854 podStartE2EDuration="5.047878647s" podCreationTimestamp="2026-04-17 17:14:20 +0000 UTC" firstStartedPulling="2026-04-17 17:14:20.814233441 +0000 UTC m=+414.430366686" lastFinishedPulling="2026-04-17 17:14:24.181002223 +0000 UTC m=+417.797135479" observedRunningTime="2026-04-17 17:14:25.046979183 +0000 UTC m=+418.663112446" watchObservedRunningTime="2026-04-17 17:14:25.047878647 +0000 UTC m=+418.664011910"
Apr 17 17:14:31.036022 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:14:31.035993 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-fqf5h"
Apr 17 17:15:03.085574 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.085488 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"]
Apr 17 17:15:03.092829 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.092808 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.097201 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.097179 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 17:15:03.097346 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.097178 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 17:15:03.097346 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.097230 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-r66kf\""
Apr 17 17:15:03.097346 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.097255 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 17:15:03.097346 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.097187 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:15:03.097346 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.097177 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 17:15:03.102608 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.102588 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"]
Apr 17 17:15:03.215457 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.215426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-manager-config\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.215457 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.215466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-cert\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.215698 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.215490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-metrics-cert\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.215698 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.215573 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m448\" (UniqueName: \"kubernetes.io/projected/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-kube-api-access-5m448\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.316289 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.316255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-cert\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.316448 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.316295 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-metrics-cert\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.316448 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.316353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m448\" (UniqueName: \"kubernetes.io/projected/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-kube-api-access-5m448\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.316448 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.316382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-manager-config\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.317068 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.317045 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-manager-config\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.318710 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.318692 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-cert\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.318786 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.318744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-metrics-cert\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.326047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.326025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m448\" (UniqueName: \"kubernetes.io/projected/648b1e6b-1be5-42e9-aa28-770c26cc4bb9-kube-api-access-5m448\") pod \"lws-controller-manager-5448568df4-j57ld\" (UID: \"648b1e6b-1be5-42e9-aa28-770c26cc4bb9\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.402102 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.402027 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:03.523041 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:03.523013 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"]
Apr 17 17:15:03.525337 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:15:03.525296 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod648b1e6b_1be5_42e9_aa28_770c26cc4bb9.slice/crio-4f34097746cb13face7ff13e7769053d1fa72e60601dd0bf9263af62405d5667 WatchSource:0}: Error finding container 4f34097746cb13face7ff13e7769053d1fa72e60601dd0bf9263af62405d5667: Status 404 returned error can't find the container with id 4f34097746cb13face7ff13e7769053d1fa72e60601dd0bf9263af62405d5667
Apr 17 17:15:04.128320 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:04.128269 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld" event={"ID":"648b1e6b-1be5-42e9-aa28-770c26cc4bb9","Type":"ContainerStarted","Data":"4f34097746cb13face7ff13e7769053d1fa72e60601dd0bf9263af62405d5667"}
Apr 17 17:15:09.143558 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.143525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld" event={"ID":"648b1e6b-1be5-42e9-aa28-770c26cc4bb9","Type":"ContainerStarted","Data":"81f2b9ad0f7cba1fc8932cacee508c1f85de9b0adb36943f660612248b0521a4"}
Apr 17 17:15:09.144011 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.143640 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld"
Apr 17 17:15:09.169933 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.169890 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld" podStartSLOduration=1.5547692990000002 podStartE2EDuration="6.169876232s" podCreationTimestamp="2026-04-17 17:15:03 +0000 UTC" firstStartedPulling="2026-04-17 17:15:03.527127936 +0000 UTC m=+457.143261177" lastFinishedPulling="2026-04-17 17:15:08.142234869 +0000 UTC m=+461.758368110" observedRunningTime="2026-04-17 17:15:09.168993584 +0000 UTC m=+462.785126846" watchObservedRunningTime="2026-04-17 17:15:09.169876232 +0000 UTC m=+462.786009494"
Apr 17 17:15:09.573443 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.573409 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"]
Apr 17 17:15:09.576410 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.576395 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.580776 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.580755 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 17:15:09.580890 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.580790 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-m57m4\""
Apr 17 17:15:09.581050 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.581037 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 17:15:09.581097 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.581064 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 17:15:09.581500 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.581486 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 17:15:09.607491 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.607452 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"]
Apr 17 17:15:09.666182 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.666149 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjz9\" (UniqueName: \"kubernetes.io/projected/1bfd898b-d4fd-435e-9577-faa2d70a9933-kube-api-access-fhjz9\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.666352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.666200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1bfd898b-d4fd-435e-9577-faa2d70a9933-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.666352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.666273 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1bfd898b-d4fd-435e-9577-faa2d70a9933-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.767320 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.767270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1bfd898b-d4fd-435e-9577-faa2d70a9933-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.767482 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.767350 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1bfd898b-d4fd-435e-9577-faa2d70a9933-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.767482 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.767397 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjz9\" (UniqueName: \"kubernetes.io/projected/1bfd898b-d4fd-435e-9577-faa2d70a9933-kube-api-access-fhjz9\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.769905 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.769874 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1bfd898b-d4fd-435e-9577-faa2d70a9933-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.770015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.769920 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1bfd898b-d4fd-435e-9577-faa2d70a9933-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.777751 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.777723 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhjz9\" (UniqueName: \"kubernetes.io/projected/1bfd898b-d4fd-435e-9577-faa2d70a9933-kube-api-access-fhjz9\") pod \"opendatahub-operator-controller-manager-6569445fb5-n5xbj\" (UID: \"1bfd898b-d4fd-435e-9577-faa2d70a9933\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:09.887716 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:09.887643 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"
Apr 17 17:15:10.006097 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:10.006068 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj"]
Apr 17 17:15:10.009129 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:15:10.009095 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bfd898b_d4fd_435e_9577_faa2d70a9933.slice/crio-252f7565c965b8fd95997b91c4e99df2d963563fb64573948cf95e673b6b35f7 WatchSource:0}: Error finding container 252f7565c965b8fd95997b91c4e99df2d963563fb64573948cf95e673b6b35f7: Status 404 returned error can't find the container with id 252f7565c965b8fd95997b91c4e99df2d963563fb64573948cf95e673b6b35f7
Apr 17 17:15:10.147834 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:10.147739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj" event={"ID":"1bfd898b-d4fd-435e-9577-faa2d70a9933","Type":"ContainerStarted","Data":"252f7565c965b8fd95997b91c4e99df2d963563fb64573948cf95e673b6b35f7"}
Apr 17 17:15:13.158525 ip-10-0-134-244
kubenswrapper[2568]: I0417 17:15:13.158491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj" event={"ID":"1bfd898b-d4fd-435e-9577-faa2d70a9933","Type":"ContainerStarted","Data":"4901c3d2a52e9d5a2b687c0ae7bfe08d224ab33473ad1f688520239fe8540de7"} Apr 17 17:15:13.158905 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:13.158635 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj" Apr 17 17:15:13.189319 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:13.189263 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj" podStartSLOduration=1.7544851609999998 podStartE2EDuration="4.189250811s" podCreationTimestamp="2026-04-17 17:15:09 +0000 UTC" firstStartedPulling="2026-04-17 17:15:10.010823163 +0000 UTC m=+463.626956404" lastFinishedPulling="2026-04-17 17:15:12.445588805 +0000 UTC m=+466.061722054" observedRunningTime="2026-04-17 17:15:13.188480061 +0000 UTC m=+466.804613324" watchObservedRunningTime="2026-04-17 17:15:13.189250811 +0000 UTC m=+466.805384115" Apr 17 17:15:20.149559 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:20.149528 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5448568df4-j57ld" Apr 17 17:15:24.163768 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:15:24.163731 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-n5xbj" Apr 17 17:16:17.199242 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.199208 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6"] Apr 17 17:16:17.201209 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.201188 
2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.204270 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.204250 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 17:16:17.204409 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.204336 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:16:17.204626 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.204609 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:16:17.205502 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.205486 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-8gpt2\"" Apr 17 17:16:17.215106 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.215085 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6"] Apr 17 17:16:17.242735 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242696 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qf4h\" (UniqueName: \"kubernetes.io/projected/5a3739be-a28c-4dd1-a517-ec86595a6822-kube-api-access-7qf4h\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.242884 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" 
(UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.242884 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.242884 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.243039 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.243039 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242930 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.243039 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5a3739be-a28c-4dd1-a517-ec86595a6822-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.243039 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242970 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.243039 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.242991 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.343768 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.343732 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5a3739be-a28c-4dd1-a517-ec86595a6822-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.343768 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.343766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.343972 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.343786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.343972 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.343908 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qf4h\" (UniqueName: \"kubernetes.io/projected/5a3739be-a28c-4dd1-a517-ec86595a6822-kube-api-access-7qf4h\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344068 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344119 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344071 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344170 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344170 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344265 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344265 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344750 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344750 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/5a3739be-a28c-4dd1-a517-ec86595a6822-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344750 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344547 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-credential-socket\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.344750 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.344635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.346767 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.346743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.346971 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.346952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.354865 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.354840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5a3739be-a28c-4dd1-a517-ec86595a6822-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: 
\"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.355004 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.354985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qf4h\" (UniqueName: \"kubernetes.io/projected/5a3739be-a28c-4dd1-a517-ec86595a6822-kube-api-access-7qf4h\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6bcz6\" (UID: \"5a3739be-a28c-4dd1-a517-ec86595a6822\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.516407 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.516315 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:17.636162 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:17.636131 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6"] Apr 17 17:16:17.640054 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:16:17.640028 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a3739be_a28c_4dd1_a517_ec86595a6822.slice/crio-386256a633a355722680cc316acd1ff8ba287a40582e45f43e2922ad77485c6a WatchSource:0}: Error finding container 386256a633a355722680cc316acd1ff8ba287a40582e45f43e2922ad77485c6a: Status 404 returned error can't find the container with id 386256a633a355722680cc316acd1ff8ba287a40582e45f43e2922ad77485c6a Apr 17 17:16:18.325363 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:18.325328 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" 
event={"ID":"5a3739be-a28c-4dd1-a517-ec86595a6822","Type":"ContainerStarted","Data":"386256a633a355722680cc316acd1ff8ba287a40582e45f43e2922ad77485c6a"} Apr 17 17:16:20.469874 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:20.469833 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:16:20.470111 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:20.469949 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:16:20.470111 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:20.469982 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:16:21.335066 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:21.335031 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" event={"ID":"5a3739be-a28c-4dd1-a517-ec86595a6822","Type":"ContainerStarted","Data":"34b0ac9a098827f86d874ed718a2202e42156926faecc77bec4b195addf23948"} Apr 17 17:16:21.354746 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:21.354698 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" podStartSLOduration=1.526931585 podStartE2EDuration="4.354686051s" podCreationTimestamp="2026-04-17 17:16:17 +0000 UTC" firstStartedPulling="2026-04-17 17:16:17.641839164 +0000 UTC m=+531.257972405" lastFinishedPulling="2026-04-17 17:16:20.46959363 +0000 UTC m=+534.085726871" observedRunningTime="2026-04-17 17:16:21.353831631 +0000 UTC m=+534.969964893" watchObservedRunningTime="2026-04-17 17:16:21.354686051 +0000 UTC 
m=+534.970819314" Apr 17 17:16:21.516814 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:21.516783 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:21.521170 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:21.521148 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:22.338088 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:22.338061 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:22.339145 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:22.339117 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6bcz6" Apr 17 17:16:27.282221 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.282182 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bbhzb"] Apr 17 17:16:27.284436 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.284416 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" Apr 17 17:16:27.287210 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.287187 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:16:27.287210 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.287204 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:16:27.287365 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.287192 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-ntkr5\"" Apr 17 17:16:27.293612 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.293595 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bbhzb"] Apr 17 17:16:27.421722 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.421695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvfp\" (UniqueName: \"kubernetes.io/projected/d960ab89-1946-42bc-88b4-1e589e03a8c1-kube-api-access-vcvfp\") pod \"kuadrant-operator-catalog-bbhzb\" (UID: \"d960ab89-1946-42bc-88b4-1e589e03a8c1\") " pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" Apr 17 17:16:27.522128 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.522096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvfp\" (UniqueName: \"kubernetes.io/projected/d960ab89-1946-42bc-88b4-1e589e03a8c1-kube-api-access-vcvfp\") pod \"kuadrant-operator-catalog-bbhzb\" (UID: \"d960ab89-1946-42bc-88b4-1e589e03a8c1\") " pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" Apr 17 17:16:27.532008 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.531980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvfp\" (UniqueName: 
\"kubernetes.io/projected/d960ab89-1946-42bc-88b4-1e589e03a8c1-kube-api-access-vcvfp\") pod \"kuadrant-operator-catalog-bbhzb\" (UID: \"d960ab89-1946-42bc-88b4-1e589e03a8c1\") " pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" Apr 17 17:16:27.594017 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.593954 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" Apr 17 17:16:27.705691 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:27.705653 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bbhzb"] Apr 17 17:16:27.708643 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:16:27.708602 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd960ab89_1946_42bc_88b4_1e589e03a8c1.slice/crio-a1d936a758681e5ad9d5c58c61c58d876f43f3876f63bfd1e0d4877d208784a9 WatchSource:0}: Error finding container a1d936a758681e5ad9d5c58c61c58d876f43f3876f63bfd1e0d4877d208784a9: Status 404 returned error can't find the container with id a1d936a758681e5ad9d5c58c61c58d876f43f3876f63bfd1e0d4877d208784a9 Apr 17 17:16:28.356284 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:28.356245 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" event={"ID":"d960ab89-1946-42bc-88b4-1e589e03a8c1","Type":"ContainerStarted","Data":"a1d936a758681e5ad9d5c58c61c58d876f43f3876f63bfd1e0d4877d208784a9"} Apr 17 17:16:30.364229 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:30.364194 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" event={"ID":"d960ab89-1946-42bc-88b4-1e589e03a8c1","Type":"ContainerStarted","Data":"cfc9d1e60b9812251409d4d02daa59a862fbf53661ca3edea8f7f150b80861fb"} Apr 17 17:16:30.379650 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:30.379586 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb" podStartSLOduration=1.3957770360000001 podStartE2EDuration="3.379572389s" podCreationTimestamp="2026-04-17 17:16:27 +0000 UTC" firstStartedPulling="2026-04-17 17:16:27.709982634 +0000 UTC m=+541.326115874" lastFinishedPulling="2026-04-17 17:16:29.693777986 +0000 UTC m=+543.309911227" observedRunningTime="2026-04-17 17:16:30.378648712 +0000 UTC m=+543.994781979" watchObservedRunningTime="2026-04-17 17:16:30.379572389 +0000 UTC m=+543.995705651"
Apr 17 17:16:37.594942 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:37.594912 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb"
Apr 17 17:16:37.594942 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:37.594948 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb"
Apr 17 17:16:37.615853 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:37.615825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb"
Apr 17 17:16:38.409545 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:38.409521 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-bbhzb"
Apr 17 17:16:59.520345 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.520299 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"]
Apr 17 17:16:59.526472 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.526454 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.529508 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.529485 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7gv2w\""
Apr 17 17:16:59.534011 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.533992 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"]
Apr 17 17:16:59.655551 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.655521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6cn\" (UniqueName: \"kubernetes.io/projected/222b2831-30d2-4a15-879b-b2e3fb94eff3-kube-api-access-wt6cn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.655696 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.655557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/222b2831-30d2-4a15-879b-b2e3fb94eff3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.756137 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.756104 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt6cn\" (UniqueName: \"kubernetes.io/projected/222b2831-30d2-4a15-879b-b2e3fb94eff3-kube-api-access-wt6cn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.756339 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.756144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/222b2831-30d2-4a15-879b-b2e3fb94eff3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.756550 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.756531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/222b2831-30d2-4a15-879b-b2e3fb94eff3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.772325 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.772240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6cn\" (UniqueName: \"kubernetes.io/projected/222b2831-30d2-4a15-879b-b2e3fb94eff3-kube-api-access-wt6cn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.837100 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.837074 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:16:59.966123 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:16:59.966101 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"]
Apr 17 17:16:59.968351 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:16:59.968323 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod222b2831_30d2_4a15_879b_b2e3fb94eff3.slice/crio-32e1c9f6df3673aedfc066c79d065517d262e1c2eecb299f38ade46b5a6e6d0c WatchSource:0}: Error finding container 32e1c9f6df3673aedfc066c79d065517d262e1c2eecb299f38ade46b5a6e6d0c: Status 404 returned error can't find the container with id 32e1c9f6df3673aedfc066c79d065517d262e1c2eecb299f38ade46b5a6e6d0c
Apr 17 17:17:00.452427 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:00.452383 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" event={"ID":"222b2831-30d2-4a15-879b-b2e3fb94eff3","Type":"ContainerStarted","Data":"32e1c9f6df3673aedfc066c79d065517d262e1c2eecb299f38ade46b5a6e6d0c"}
Apr 17 17:17:04.482280 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.482242 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-t5j8c"]
Apr 17 17:17:04.484451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.484431 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:04.487963 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.487932 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-f822q\""
Apr 17 17:17:04.507668 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.507635 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-t5j8c"]
Apr 17 17:17:04.596205 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.596168 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfnvq\" (UniqueName: \"kubernetes.io/projected/1e23ac42-c082-4abc-9a28-357b80204db7-kube-api-access-wfnvq\") pod \"authorino-operator-657f44b778-t5j8c\" (UID: \"1e23ac42-c082-4abc-9a28-357b80204db7\") " pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:04.697591 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.697553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfnvq\" (UniqueName: \"kubernetes.io/projected/1e23ac42-c082-4abc-9a28-357b80204db7-kube-api-access-wfnvq\") pod \"authorino-operator-657f44b778-t5j8c\" (UID: \"1e23ac42-c082-4abc-9a28-357b80204db7\") " pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:04.708464 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.708422 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfnvq\" (UniqueName: \"kubernetes.io/projected/1e23ac42-c082-4abc-9a28-357b80204db7-kube-api-access-wfnvq\") pod \"authorino-operator-657f44b778-t5j8c\" (UID: \"1e23ac42-c082-4abc-9a28-357b80204db7\") " pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:04.796672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:04.796641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:05.426055 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:05.426033 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-t5j8c"]
Apr 17 17:17:05.428061 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:17:05.428035 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e23ac42_c082_4abc_9a28_357b80204db7.slice/crio-5aa836a3c7d4caa2f37f7c46b740595d0f49de610bddfb7ce3f3d998d5cd1e18 WatchSource:0}: Error finding container 5aa836a3c7d4caa2f37f7c46b740595d0f49de610bddfb7ce3f3d998d5cd1e18: Status 404 returned error can't find the container with id 5aa836a3c7d4caa2f37f7c46b740595d0f49de610bddfb7ce3f3d998d5cd1e18
Apr 17 17:17:05.470106 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:05.470079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c" event={"ID":"1e23ac42-c082-4abc-9a28-357b80204db7","Type":"ContainerStarted","Data":"5aa836a3c7d4caa2f37f7c46b740595d0f49de610bddfb7ce3f3d998d5cd1e18"}
Apr 17 17:17:05.471384 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:05.471361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" event={"ID":"222b2831-30d2-4a15-879b-b2e3fb94eff3","Type":"ContainerStarted","Data":"a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77"}
Apr 17 17:17:05.471559 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:05.471542 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:17:05.492156 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:05.492114 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" podStartSLOduration=1.126205429 podStartE2EDuration="6.492105132s" podCreationTimestamp="2026-04-17 17:16:59 +0000 UTC" firstStartedPulling="2026-04-17 17:16:59.970506225 +0000 UTC m=+573.586639467" lastFinishedPulling="2026-04-17 17:17:05.336405925 +0000 UTC m=+578.952539170" observedRunningTime="2026-04-17 17:17:05.490583886 +0000 UTC m=+579.106717149" watchObservedRunningTime="2026-04-17 17:17:05.492105132 +0000 UTC m=+579.108238392"
Apr 17 17:17:08.483576 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:08.483539 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c" event={"ID":"1e23ac42-c082-4abc-9a28-357b80204db7","Type":"ContainerStarted","Data":"0f844dc02c82a494ce8a71736544920c232a3c062c46fa39f5262b9bad3bda59"}
Apr 17 17:17:08.483971 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:08.483655 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:08.509947 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:08.509905 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c" podStartSLOduration=2.322428483 podStartE2EDuration="4.509893969s" podCreationTimestamp="2026-04-17 17:17:04 +0000 UTC" firstStartedPulling="2026-04-17 17:17:05.43004597 +0000 UTC m=+579.046179212" lastFinishedPulling="2026-04-17 17:17:07.617511454 +0000 UTC m=+581.233644698" observedRunningTime="2026-04-17 17:17:08.509100877 +0000 UTC m=+582.125234137" watchObservedRunningTime="2026-04-17 17:17:08.509893969 +0000 UTC m=+582.126027297"
Apr 17 17:17:16.477036 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:16.477005 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:17:18.186059 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.186022 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"]
Apr 17 17:17:18.186677 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.186276 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" containerName="manager" containerID="cri-o://a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77" gracePeriod=2
Apr 17 17:17:18.188451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.188409 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"]
Apr 17 17:17:18.189715 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.189681 2568 status_manager.go:895] "Failed to get status for pod" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 17 17:17:18.222664 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.222636 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"]
Apr 17 17:17:18.222905 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.222893 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" containerName="manager"
Apr 17 17:17:18.222975 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.222907 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" containerName="manager"
Apr 17 17:17:18.222975 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.222966 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" containerName="manager"
Apr 17 17:17:18.223631 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:17:18.223601 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod222b2831_30d2_4a15_879b_b2e3fb94eff3.slice/crio-a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 17:17:18.225747 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.225726 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.245296 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.245269 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"]
Apr 17 17:17:18.277225 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.277183 2568 status_manager.go:895] "Failed to get status for pod" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 17 17:17:18.397470 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.397446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82fxn\" (UniqueName: \"kubernetes.io/projected/868da124-725c-4740-b2eb-3276ca689a81-kube-api-access-82fxn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gm4hd\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.397566 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.397479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/868da124-725c-4740-b2eb-3276ca689a81-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gm4hd\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.411961 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.411941 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:17:18.414557 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.414532 2568 status_manager.go:895] "Failed to get status for pod" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 17 17:17:18.498788 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.498720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82fxn\" (UniqueName: \"kubernetes.io/projected/868da124-725c-4740-b2eb-3276ca689a81-kube-api-access-82fxn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gm4hd\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.498788 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.498754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/868da124-725c-4740-b2eb-3276ca689a81-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gm4hd\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.499060 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.499043 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/868da124-725c-4740-b2eb-3276ca689a81-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gm4hd\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.510050 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.510026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82fxn\" (UniqueName: \"kubernetes.io/projected/868da124-725c-4740-b2eb-3276ca689a81-kube-api-access-82fxn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gm4hd\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.513100 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.513076 2568 generic.go:358] "Generic (PLEG): container finished" podID="222b2831-30d2-4a15-879b-b2e3fb94eff3" containerID="a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77" exitCode=0
Apr 17 17:17:18.513202 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.513127 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp"
Apr 17 17:17:18.513202 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.513169 2568 scope.go:117] "RemoveContainer" containerID="a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77"
Apr 17 17:17:18.515544 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.515519 2568 status_manager.go:895] "Failed to get status for pod" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 17 17:17:18.520415 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.520397 2568 scope.go:117] "RemoveContainer" containerID="a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77"
Apr 17 17:17:18.520678 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:17:18.520659 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77\": container with ID starting with a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77 not found: ID does not exist" containerID="a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77"
Apr 17 17:17:18.520732 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.520690 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77"} err="failed to get container status \"a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77\": rpc error: code = NotFound desc = could not find container \"a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77\": container with ID starting with a01eebf6b66a55959ec9976d6ccbf8ff36aeaaf644b30be8644dd24c32165d77 not found: ID does not exist"
Apr 17 17:17:18.551849 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.551831 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:18.599993 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.599969 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/222b2831-30d2-4a15-879b-b2e3fb94eff3-extensions-socket-volume\") pod \"222b2831-30d2-4a15-879b-b2e3fb94eff3\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") "
Apr 17 17:17:18.600129 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.600011 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt6cn\" (UniqueName: \"kubernetes.io/projected/222b2831-30d2-4a15-879b-b2e3fb94eff3-kube-api-access-wt6cn\") pod \"222b2831-30d2-4a15-879b-b2e3fb94eff3\" (UID: \"222b2831-30d2-4a15-879b-b2e3fb94eff3\") "
Apr 17 17:17:18.600769 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.600738 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/222b2831-30d2-4a15-879b-b2e3fb94eff3-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "222b2831-30d2-4a15-879b-b2e3fb94eff3" (UID: "222b2831-30d2-4a15-879b-b2e3fb94eff3"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:17:18.602204 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.602178 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222b2831-30d2-4a15-879b-b2e3fb94eff3-kube-api-access-wt6cn" (OuterVolumeSpecName: "kube-api-access-wt6cn") pod "222b2831-30d2-4a15-879b-b2e3fb94eff3" (UID: "222b2831-30d2-4a15-879b-b2e3fb94eff3"). InnerVolumeSpecName "kube-api-access-wt6cn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:17:18.670613 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.670590 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"]
Apr 17 17:17:18.673130 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:17:18.673101 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868da124_725c_4740_b2eb_3276ca689a81.slice/crio-f8b0794bd2c6f4b3eb25a51aef66c0a1c86459619c0825a547b28f7a77700647 WatchSource:0}: Error finding container f8b0794bd2c6f4b3eb25a51aef66c0a1c86459619c0825a547b28f7a77700647: Status 404 returned error can't find the container with id f8b0794bd2c6f4b3eb25a51aef66c0a1c86459619c0825a547b28f7a77700647
Apr 17 17:17:18.700822 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.700802 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/222b2831-30d2-4a15-879b-b2e3fb94eff3-extensions-socket-volume\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:17:18.700904 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.700824 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wt6cn\" (UniqueName: \"kubernetes.io/projected/222b2831-30d2-4a15-879b-b2e3fb94eff3-kube-api-access-wt6cn\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:17:18.823322 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.823277 2568 status_manager.go:895] "Failed to get status for pod" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-79lqp" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-79lqp\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 17 17:17:18.905769 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:18.905734 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="222b2831-30d2-4a15-879b-b2e3fb94eff3" path="/var/lib/kubelet/pods/222b2831-30d2-4a15-879b-b2e3fb94eff3/volumes"
Apr 17 17:17:19.489152 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:19.489059 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-t5j8c"
Apr 17 17:17:19.517959 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:19.517919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd" event={"ID":"868da124-725c-4740-b2eb-3276ca689a81","Type":"ContainerStarted","Data":"11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb"}
Apr 17 17:17:19.517959 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:19.517960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd" event={"ID":"868da124-725c-4740-b2eb-3276ca689a81","Type":"ContainerStarted","Data":"f8b0794bd2c6f4b3eb25a51aef66c0a1c86459619c0825a547b28f7a77700647"}
Apr 17 17:17:19.518153 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:19.517996 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:19.568803 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:19.568755 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd" podStartSLOduration=1.568741655 podStartE2EDuration="1.568741655s" podCreationTimestamp="2026-04-17 17:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:17:19.568593447 +0000 UTC m=+593.184726710" watchObservedRunningTime="2026-04-17 17:17:19.568741655 +0000 UTC m=+593.184874918"
Apr 17 17:17:26.820632 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:26.820611 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log"
Apr 17 17:17:26.821011 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:26.820683 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log"
Apr 17 17:17:30.522808 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:30.522781 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:34.040383 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.040350 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"]
Apr 17 17:17:34.040768 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.040616 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd" podUID="868da124-725c-4740-b2eb-3276ca689a81" containerName="manager" containerID="cri-o://11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb" gracePeriod=10
Apr 17 17:17:34.285898 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.285875 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:34.307114 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.307063 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/868da124-725c-4740-b2eb-3276ca689a81-extensions-socket-volume\") pod \"868da124-725c-4740-b2eb-3276ca689a81\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") "
Apr 17 17:17:34.307114 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.307094 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82fxn\" (UniqueName: \"kubernetes.io/projected/868da124-725c-4740-b2eb-3276ca689a81-kube-api-access-82fxn\") pod \"868da124-725c-4740-b2eb-3276ca689a81\" (UID: \"868da124-725c-4740-b2eb-3276ca689a81\") "
Apr 17 17:17:34.307499 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.307468 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868da124-725c-4740-b2eb-3276ca689a81-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "868da124-725c-4740-b2eb-3276ca689a81" (UID: "868da124-725c-4740-b2eb-3276ca689a81"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:17:34.309011 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.308990 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868da124-725c-4740-b2eb-3276ca689a81-kube-api-access-82fxn" (OuterVolumeSpecName: "kube-api-access-82fxn") pod "868da124-725c-4740-b2eb-3276ca689a81" (UID: "868da124-725c-4740-b2eb-3276ca689a81"). InnerVolumeSpecName "kube-api-access-82fxn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:17:34.408123 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.408088 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82fxn\" (UniqueName: \"kubernetes.io/projected/868da124-725c-4740-b2eb-3276ca689a81-kube-api-access-82fxn\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:17:34.408123 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.408120 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/868da124-725c-4740-b2eb-3276ca689a81-extensions-socket-volume\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 17 17:17:34.562416 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.562327 2568 generic.go:358] "Generic (PLEG): container finished" podID="868da124-725c-4740-b2eb-3276ca689a81" containerID="11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb" exitCode=0
Apr 17 17:17:34.562416 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.562389 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"
Apr 17 17:17:34.562416 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.562393 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd" event={"ID":"868da124-725c-4740-b2eb-3276ca689a81","Type":"ContainerDied","Data":"11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb"}
Apr 17 17:17:34.562673 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.562429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd" event={"ID":"868da124-725c-4740-b2eb-3276ca689a81","Type":"ContainerDied","Data":"f8b0794bd2c6f4b3eb25a51aef66c0a1c86459619c0825a547b28f7a77700647"}
Apr 17 17:17:34.562673 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.562445 2568 scope.go:117] "RemoveContainer" containerID="11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb"
Apr 17 17:17:34.570909 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.570892 2568 scope.go:117] "RemoveContainer" containerID="11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb"
Apr 17 17:17:34.571167 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:17:34.571144 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb\": container with ID starting with 11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb not found: ID does not exist" containerID="11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb"
Apr 17 17:17:34.571240 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.571178 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb"} err="failed to get container status \"11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb\": rpc error: code = NotFound desc = could not find container \"11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb\": container with ID starting with 11b1d209f732aa267eb11d0f68b1577816f2c883af08e8af3f110c4f1ef5dadb not found: ID does not exist"
Apr 17 17:17:34.584166 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.584139 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"]
Apr 17 17:17:34.590683 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.590665 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gm4hd"]
Apr 17 17:17:34.904496 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:34.904421 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868da124-725c-4740-b2eb-3276ca689a81" path="/var/lib/kubelet/pods/868da124-725c-4740-b2eb-3276ca689a81/volumes"
Apr 17 17:17:50.294944 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.294867 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"]
Apr 17 17:17:50.295391 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.295179 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="868da124-725c-4740-b2eb-3276ca689a81" containerName="manager"
Apr 17 17:17:50.295391 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.295193 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="868da124-725c-4740-b2eb-3276ca689a81" containerName="manager"
Apr 17 17:17:50.295391 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.295242 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="868da124-725c-4740-b2eb-3276ca689a81" containerName="manager"
Apr 17 17:17:50.298800 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.298784 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.302431 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.302411 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-mjqmk\""
Apr 17 17:17:50.312657 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.312629 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"]
Apr 17 17:17:50.321389 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321327 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.321389 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.321563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321392 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.321563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.321563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zlr\" (UniqueName: \"kubernetes.io/projected/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-kube-api-access-n5zlr\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.321563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"
Apr 17 17:17:50.321563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") "
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.321563 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321529 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.321872 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.321625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422184 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422369 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 
17:17:50.422369 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422369 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422534 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422534 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422534 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422471 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422534 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422534 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422506 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zlr\" (UniqueName: \"kubernetes.io/projected/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-kube-api-access-n5zlr\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422783 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422835 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422884 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422936 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.422994 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.422986 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.424506 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.424485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: 
\"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.424801 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.424784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.430216 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.430189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.430321 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.430216 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zlr\" (UniqueName: \"kubernetes.io/projected/e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d-kube-api-access-n5zlr\") pod \"maas-default-gateway-openshift-default-845c6b4b48-65pp9\" (UID: \"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.608790 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.608708 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:50.730744 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.730713 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9"] Apr 17 17:17:50.733724 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:17:50.733693 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode07126fb_0a4d_4f34_9ab0_5c2c5aa8352d.slice/crio-76ce8f2903e713964d9b10c042b80b88ab804a89127ad4a5452bf4aac4162d1a WatchSource:0}: Error finding container 76ce8f2903e713964d9b10c042b80b88ab804a89127ad4a5452bf4aac4162d1a: Status 404 returned error can't find the container with id 76ce8f2903e713964d9b10c042b80b88ab804a89127ad4a5452bf4aac4162d1a Apr 17 17:17:50.735794 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.735762 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:17:50.735865 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.735825 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:17:50.735865 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:50.735850 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:17:51.616013 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:51.615977 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" 
event={"ID":"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d","Type":"ContainerStarted","Data":"057a2731562f569c37c13be691db7c3ff49aaa867b4e4cc552a146b14e419d7d"} Apr 17 17:17:51.616013 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:51.616013 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" event={"ID":"e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d","Type":"ContainerStarted","Data":"76ce8f2903e713964d9b10c042b80b88ab804a89127ad4a5452bf4aac4162d1a"} Apr 17 17:17:51.636614 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:51.636565 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" podStartSLOduration=1.63655045 podStartE2EDuration="1.63655045s" podCreationTimestamp="2026-04-17 17:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:17:51.634753711 +0000 UTC m=+625.250886987" watchObservedRunningTime="2026-04-17 17:17:51.63655045 +0000 UTC m=+625.252683713" Apr 17 17:17:52.609467 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:52.609436 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:52.614012 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:52.613988 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:52.618787 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:52.618763 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:52.619608 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:52.619591 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-65pp9" Apr 17 17:17:55.219596 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.219561 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:17:55.221578 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.221562 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.224637 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.224615 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8n2w7\"" Apr 17 17:17:55.224637 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.224629 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 17:17:55.231409 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.231388 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:17:55.257426 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.257402 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wkg8\" (UniqueName: \"kubernetes.io/projected/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-kube-api-access-5wkg8\") pod \"limitador-limitador-7d549b5b-w4rq6\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.257545 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.257433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-config-file\") pod \"limitador-limitador-7d549b5b-w4rq6\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " 
pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.317962 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.317928 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:17:55.358173 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.358137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wkg8\" (UniqueName: \"kubernetes.io/projected/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-kube-api-access-5wkg8\") pod \"limitador-limitador-7d549b5b-w4rq6\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.358173 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.358176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-config-file\") pod \"limitador-limitador-7d549b5b-w4rq6\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.358808 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.358783 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-config-file\") pod \"limitador-limitador-7d549b5b-w4rq6\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.366536 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.366512 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wkg8\" (UniqueName: \"kubernetes.io/projected/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-kube-api-access-5wkg8\") pod \"limitador-limitador-7d549b5b-w4rq6\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.531641 ip-10-0-134-244 
kubenswrapper[2568]: I0417 17:17:55.531614 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:55.651561 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:55.651538 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:17:55.653806 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:17:55.653778 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d6a406_f8d8_4ab3_b4ad_f16f5654a35b.slice/crio-66b7b2e67e7ca387dc1054b0806f73842a28bcaf39edd5845960899f0ec48802 WatchSource:0}: Error finding container 66b7b2e67e7ca387dc1054b0806f73842a28bcaf39edd5845960899f0ec48802: Status 404 returned error can't find the container with id 66b7b2e67e7ca387dc1054b0806f73842a28bcaf39edd5845960899f0ec48802 Apr 17 17:17:56.142637 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.142609 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vzm6m"] Apr 17 17:17:56.145376 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.145354 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:17:56.148312 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.148281 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-826k6\"" Apr 17 17:17:56.152567 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.152530 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vzm6m"] Apr 17 17:17:56.265382 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.265353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/271b6f97-349c-47cd-8a83-b9852bc0a096-kube-api-access-m6jrq\") pod \"authorino-f99f4b5cd-vzm6m\" (UID: \"271b6f97-349c-47cd-8a83-b9852bc0a096\") " pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:17:56.366015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.365980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/271b6f97-349c-47cd-8a83-b9852bc0a096-kube-api-access-m6jrq\") pod \"authorino-f99f4b5cd-vzm6m\" (UID: \"271b6f97-349c-47cd-8a83-b9852bc0a096\") " pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:17:56.374995 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.374944 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/271b6f97-349c-47cd-8a83-b9852bc0a096-kube-api-access-m6jrq\") pod \"authorino-f99f4b5cd-vzm6m\" (UID: \"271b6f97-349c-47cd-8a83-b9852bc0a096\") " pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:17:56.445522 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.444965 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-9c4zj"] Apr 17 17:17:56.447756 ip-10-0-134-244 
kubenswrapper[2568]: I0417 17:17:56.447717 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:17:56.455282 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.455258 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-9c4zj"] Apr 17 17:17:56.457396 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.457374 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:17:56.568002 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.567852 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bb2g\" (UniqueName: \"kubernetes.io/projected/25ae0567-d16c-4054-98dd-fd48df455143-kube-api-access-9bb2g\") pod \"authorino-7498df8756-9c4zj\" (UID: \"25ae0567-d16c-4054-98dd-fd48df455143\") " pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:17:56.613646 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.613240 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vzm6m"] Apr 17 17:17:56.618378 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:17:56.618345 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod271b6f97_349c_47cd_8a83_b9852bc0a096.slice/crio-823aa738b48c7acfeaf61facc4436fe8e233fb3d211d43c9c80f3570fda72d8c WatchSource:0}: Error finding container 823aa738b48c7acfeaf61facc4436fe8e233fb3d211d43c9c80f3570fda72d8c: Status 404 returned error can't find the container with id 823aa738b48c7acfeaf61facc4436fe8e233fb3d211d43c9c80f3570fda72d8c Apr 17 17:17:56.633115 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.633081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" 
event={"ID":"271b6f97-349c-47cd-8a83-b9852bc0a096","Type":"ContainerStarted","Data":"823aa738b48c7acfeaf61facc4436fe8e233fb3d211d43c9c80f3570fda72d8c"} Apr 17 17:17:56.634622 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.634594 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" event={"ID":"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b","Type":"ContainerStarted","Data":"66b7b2e67e7ca387dc1054b0806f73842a28bcaf39edd5845960899f0ec48802"} Apr 17 17:17:56.668841 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.668803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bb2g\" (UniqueName: \"kubernetes.io/projected/25ae0567-d16c-4054-98dd-fd48df455143-kube-api-access-9bb2g\") pod \"authorino-7498df8756-9c4zj\" (UID: \"25ae0567-d16c-4054-98dd-fd48df455143\") " pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:17:56.678395 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.678372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bb2g\" (UniqueName: \"kubernetes.io/projected/25ae0567-d16c-4054-98dd-fd48df455143-kube-api-access-9bb2g\") pod \"authorino-7498df8756-9c4zj\" (UID: \"25ae0567-d16c-4054-98dd-fd48df455143\") " pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:17:56.761720 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.761629 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:17:56.906584 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:56.906558 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-9c4zj"] Apr 17 17:17:56.907588 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:17:56.907562 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ae0567_d16c_4054_98dd_fd48df455143.slice/crio-3df3b40679e981dc142c5d2ff4750c9cc6d2e3f309bfa201e7e9ebe68350bd7b WatchSource:0}: Error finding container 3df3b40679e981dc142c5d2ff4750c9cc6d2e3f309bfa201e7e9ebe68350bd7b: Status 404 returned error can't find the container with id 3df3b40679e981dc142c5d2ff4750c9cc6d2e3f309bfa201e7e9ebe68350bd7b Apr 17 17:17:57.639144 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:57.639101 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-9c4zj" event={"ID":"25ae0567-d16c-4054-98dd-fd48df455143","Type":"ContainerStarted","Data":"3df3b40679e981dc142c5d2ff4750c9cc6d2e3f309bfa201e7e9ebe68350bd7b"} Apr 17 17:17:58.644414 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:58.644375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" event={"ID":"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b","Type":"ContainerStarted","Data":"885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e"} Apr 17 17:17:58.644821 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:58.644450 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:17:58.663069 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:17:58.662907 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" podStartSLOduration=0.897883308 
podStartE2EDuration="3.662889399s" podCreationTimestamp="2026-04-17 17:17:55 +0000 UTC" firstStartedPulling="2026-04-17 17:17:55.655573265 +0000 UTC m=+629.271706507" lastFinishedPulling="2026-04-17 17:17:58.420579354 +0000 UTC m=+632.036712598" observedRunningTime="2026-04-17 17:17:58.660541294 +0000 UTC m=+632.276674550" watchObservedRunningTime="2026-04-17 17:17:58.662889399 +0000 UTC m=+632.279022666" Apr 17 17:18:00.652758 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:00.652719 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-9c4zj" event={"ID":"25ae0567-d16c-4054-98dd-fd48df455143","Type":"ContainerStarted","Data":"4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575"} Apr 17 17:18:00.654832 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:00.654805 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" event={"ID":"271b6f97-349c-47cd-8a83-b9852bc0a096","Type":"ContainerStarted","Data":"2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a"} Apr 17 17:18:00.669636 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:00.669573 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-9c4zj" podStartSLOduration=1.061290407 podStartE2EDuration="4.669556202s" podCreationTimestamp="2026-04-17 17:17:56 +0000 UTC" firstStartedPulling="2026-04-17 17:17:56.908879834 +0000 UTC m=+630.525013086" lastFinishedPulling="2026-04-17 17:18:00.517145626 +0000 UTC m=+634.133278881" observedRunningTime="2026-04-17 17:18:00.668168495 +0000 UTC m=+634.284301755" watchObservedRunningTime="2026-04-17 17:18:00.669556202 +0000 UTC m=+634.285689468" Apr 17 17:18:00.684981 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:00.684914 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" podStartSLOduration=0.784125474 podStartE2EDuration="4.684900628s" 
podCreationTimestamp="2026-04-17 17:17:56 +0000 UTC" firstStartedPulling="2026-04-17 17:17:56.620804319 +0000 UTC m=+630.236937575" lastFinishedPulling="2026-04-17 17:18:00.521579484 +0000 UTC m=+634.137712729" observedRunningTime="2026-04-17 17:18:00.68359358 +0000 UTC m=+634.299726845" watchObservedRunningTime="2026-04-17 17:18:00.684900628 +0000 UTC m=+634.301033891" Apr 17 17:18:00.709580 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:00.709552 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vzm6m"] Apr 17 17:18:02.661060 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:02.661002 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" podUID="271b6f97-349c-47cd-8a83-b9852bc0a096" containerName="authorino" containerID="cri-o://2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a" gracePeriod=30 Apr 17 17:18:02.900341 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:02.900320 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:18:03.024293 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.024262 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/271b6f97-349c-47cd-8a83-b9852bc0a096-kube-api-access-m6jrq\") pod \"271b6f97-349c-47cd-8a83-b9852bc0a096\" (UID: \"271b6f97-349c-47cd-8a83-b9852bc0a096\") " Apr 17 17:18:03.026181 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.026146 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271b6f97-349c-47cd-8a83-b9852bc0a096-kube-api-access-m6jrq" (OuterVolumeSpecName: "kube-api-access-m6jrq") pod "271b6f97-349c-47cd-8a83-b9852bc0a096" (UID: "271b6f97-349c-47cd-8a83-b9852bc0a096"). InnerVolumeSpecName "kube-api-access-m6jrq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:18:03.125270 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.125231 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/271b6f97-349c-47cd-8a83-b9852bc0a096-kube-api-access-m6jrq\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:03.665728 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.665690 2568 generic.go:358] "Generic (PLEG): container finished" podID="271b6f97-349c-47cd-8a83-b9852bc0a096" containerID="2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a" exitCode=0 Apr 17 17:18:03.666168 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.665736 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" event={"ID":"271b6f97-349c-47cd-8a83-b9852bc0a096","Type":"ContainerDied","Data":"2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a"} Apr 17 17:18:03.666168 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.665754 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" Apr 17 17:18:03.666168 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.665762 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-vzm6m" event={"ID":"271b6f97-349c-47cd-8a83-b9852bc0a096","Type":"ContainerDied","Data":"823aa738b48c7acfeaf61facc4436fe8e233fb3d211d43c9c80f3570fda72d8c"} Apr 17 17:18:03.666168 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.665781 2568 scope.go:117] "RemoveContainer" containerID="2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a" Apr 17 17:18:03.674092 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.674072 2568 scope.go:117] "RemoveContainer" containerID="2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a" Apr 17 17:18:03.674357 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:03.674334 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a\": container with ID starting with 2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a not found: ID does not exist" containerID="2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a" Apr 17 17:18:03.674407 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.674366 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a"} err="failed to get container status \"2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a\": rpc error: code = NotFound desc = could not find container \"2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a\": container with ID starting with 2eccff7e63f3372a366ea482db618e7b939cc1242d0aeb618baedb7667bb627a not found: ID does not exist" Apr 17 17:18:03.687480 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.687454 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vzm6m"] Apr 17 17:18:03.689971 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:03.689948 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-vzm6m"] Apr 17 17:18:04.905192 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:04.905159 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="271b6f97-349c-47cd-8a83-b9852bc0a096" path="/var/lib/kubelet/pods/271b6f97-349c-47cd-8a83-b9852bc0a096/volumes" Apr 17 17:18:09.649594 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:09.649569 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:18:10.610152 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:10.610115 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:18:10.610381 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:10.610341 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" podUID="e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" containerName="limitador" containerID="cri-o://885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e" gracePeriod=30 Apr 17 17:18:11.143062 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.143040 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:18:11.287467 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.287437 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-config-file\") pod \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " Apr 17 17:18:11.287642 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.287497 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wkg8\" (UniqueName: \"kubernetes.io/projected/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-kube-api-access-5wkg8\") pod \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\" (UID: \"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b\") " Apr 17 17:18:11.287805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.287781 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-config-file" (OuterVolumeSpecName: "config-file") pod "e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" (UID: "e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:18:11.289573 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.289555 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-kube-api-access-5wkg8" (OuterVolumeSpecName: "kube-api-access-5wkg8") pod "e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" (UID: "e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b"). InnerVolumeSpecName "kube-api-access-5wkg8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:18:11.388400 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.388376 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-config-file\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:11.388400 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.388399 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5wkg8\" (UniqueName: \"kubernetes.io/projected/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b-kube-api-access-5wkg8\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:11.693294 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.693221 2568 generic.go:358] "Generic (PLEG): container finished" podID="e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" containerID="885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e" exitCode=0 Apr 17 17:18:11.693294 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.693280 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" Apr 17 17:18:11.693445 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.693291 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" event={"ID":"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b","Type":"ContainerDied","Data":"885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e"} Apr 17 17:18:11.693445 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.693338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4rq6" event={"ID":"e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b","Type":"ContainerDied","Data":"66b7b2e67e7ca387dc1054b0806f73842a28bcaf39edd5845960899f0ec48802"} Apr 17 17:18:11.693445 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.693354 2568 scope.go:117] "RemoveContainer" containerID="885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e" Apr 17 17:18:11.701174 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.701156 2568 scope.go:117] "RemoveContainer" containerID="885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e" Apr 17 17:18:11.701528 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:11.701505 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e\": container with ID starting with 885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e not found: ID does not exist" containerID="885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e" Apr 17 17:18:11.701627 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.701533 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e"} err="failed to get container status 
\"885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e\": rpc error: code = NotFound desc = could not find container \"885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e\": container with ID starting with 885b15d9f6b120752673d0e38a62c1e1524917aa62ebbbb4107524c553e40d3e not found: ID does not exist" Apr 17 17:18:11.723939 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.723917 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:18:11.740262 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:11.740244 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4rq6"] Apr 17 17:18:12.905388 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:12.905353 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" path="/var/lib/kubelet/pods/e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b/volumes" Apr 17 17:18:16.181741 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.181706 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-qjzlx"] Apr 17 17:18:16.182223 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.181994 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" containerName="limitador" Apr 17 17:18:16.182223 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.182009 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" containerName="limitador" Apr 17 17:18:16.182223 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.182031 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="271b6f97-349c-47cd-8a83-b9852bc0a096" containerName="authorino" Apr 17 17:18:16.182223 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.182039 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="271b6f97-349c-47cd-8a83-b9852bc0a096" containerName="authorino" Apr 17 17:18:16.182223 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.182106 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1d6a406-f8d8-4ab3-b4ad-f16f5654a35b" containerName="limitador" Apr 17 17:18:16.182223 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.182118 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="271b6f97-349c-47cd-8a83-b9852bc0a096" containerName="authorino" Apr 17 17:18:16.184614 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.184591 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.189426 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.189401 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 17:18:16.189557 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.189477 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-qbxkw\"" Apr 17 17:18:16.197508 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.197486 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qjzlx"] Apr 17 17:18:16.329780 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.329748 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e259cda-389b-4038-8c99-8b966fbdd99c-data\") pod \"postgres-868db5846d-qjzlx\" (UID: \"1e259cda-389b-4038-8c99-8b966fbdd99c\") " pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.329978 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.329804 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbqz2\" (UniqueName: 
\"kubernetes.io/projected/1e259cda-389b-4038-8c99-8b966fbdd99c-kube-api-access-pbqz2\") pod \"postgres-868db5846d-qjzlx\" (UID: \"1e259cda-389b-4038-8c99-8b966fbdd99c\") " pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.430553 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.430519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e259cda-389b-4038-8c99-8b966fbdd99c-data\") pod \"postgres-868db5846d-qjzlx\" (UID: \"1e259cda-389b-4038-8c99-8b966fbdd99c\") " pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.430703 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.430570 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbqz2\" (UniqueName: \"kubernetes.io/projected/1e259cda-389b-4038-8c99-8b966fbdd99c-kube-api-access-pbqz2\") pod \"postgres-868db5846d-qjzlx\" (UID: \"1e259cda-389b-4038-8c99-8b966fbdd99c\") " pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.430894 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.430875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e259cda-389b-4038-8c99-8b966fbdd99c-data\") pod \"postgres-868db5846d-qjzlx\" (UID: \"1e259cda-389b-4038-8c99-8b966fbdd99c\") " pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.438593 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.438525 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbqz2\" (UniqueName: \"kubernetes.io/projected/1e259cda-389b-4038-8c99-8b966fbdd99c-kube-api-access-pbqz2\") pod \"postgres-868db5846d-qjzlx\" (UID: \"1e259cda-389b-4038-8c99-8b966fbdd99c\") " pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.497400 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.497370 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:16.610147 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.610116 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qjzlx"] Apr 17 17:18:16.613542 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:18:16.613505 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e259cda_389b_4038_8c99_8b966fbdd99c.slice/crio-ed6769515a706cc3b29fab0df9dbd48bf6c4e5116c1f105175c445ddcaf4e766 WatchSource:0}: Error finding container ed6769515a706cc3b29fab0df9dbd48bf6c4e5116c1f105175c445ddcaf4e766: Status 404 returned error can't find the container with id ed6769515a706cc3b29fab0df9dbd48bf6c4e5116c1f105175c445ddcaf4e766 Apr 17 17:18:16.710835 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:16.710752 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qjzlx" event={"ID":"1e259cda-389b-4038-8c99-8b966fbdd99c","Type":"ContainerStarted","Data":"ed6769515a706cc3b29fab0df9dbd48bf6c4e5116c1f105175c445ddcaf4e766"} Apr 17 17:18:19.300450 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:19.300397 2568 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/f5/f57fa8c0c6149ed227f93ff8adbe8ee7e70339fc8bb83bf697c1b8a30d0f7f33?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260417%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260417T171816Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=29ca32f600fbd0ae2673daee5c5bcdee5e4f69b9efa456bd214d0f92a1d0f17c&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=rhel9----postgresql-16&akamai_signature=exp=1776447196~hmac=da42cf8b333ed43025d29d220bb1d164f1a74bbbdf96187d0aec639c4639a837\": remote error: tls: internal error; artifact err: provided artifact is a container image" image="registry.redhat.io/rhel9/postgresql-16:latest@sha256:680b42d2c51b76d23cd5b68dd774af456b1e4c98c4aaeb49d0de0948dc933716" Apr 17 17:18:19.300816 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:19.300599 2568 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:postgres,Image:registry.redhat.io/rhel9/postgresql-16:latest@sha256:680b42d2c51b76d23cd5b68dd774af456b1e4c98c4aaeb49d0de0948dc933716,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:5432,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POSTGRESQL_USER,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:postgres-creds,},Key:POSTGRES_USER,Optional:nil,},},},EnvVar{Name:POSTGRESQL_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:postgres-creds,},Key:POSTGRES_PASSWORD,Optional:nil,},},},EnvVar{Name:POSTGRESQL_DATABASE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:postgres-creds,},Key:POSTGRES_DB,Optional:nil,},},},},
Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/pgsql/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbqz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/libexec/check-container],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod postgres-868db5846d-qjzlx_opendatahub(1e259cda-389b-4038-8c99-8b966fbdd99c): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/f5/f57fa8c0c6149ed227f93ff8adbe8ee7e70339fc8bb83bf697c1b8a30d0f7f33?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260417%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260417T171816Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=29ca32f600fbd0ae2673daee5c5bcdee5e4f69b9efa456bd214d0f92a1d0f17c&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=rhel9----postgresql-16&akamai_signature=exp=1776447196~hmac=da42cf8b333ed43025d29d220bb1d164f1a74bbbdf96187d0aec639c4639a837\": remote error: tls: internal error; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:18:19.301797 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:19.301763 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"postgres\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/f5/f57fa8c0c6149ed227f93ff8adbe8ee7e70339fc8bb83bf697c1b8a30d0f7f33?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260417%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260417T171816Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=29ca32f600fbd0ae2673daee5c5bcdee5e4f69b9efa456bd214d0f92a1d0f17c&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=rhel9----postgresql-16&akamai_signature=exp=1776447196~hmac=da42cf8b333ed43025d29d220bb1d164f1a74bbbdf96187d0aec639c4639a837\\\": remote error: tls: internal error; artifact err: provided artifact is a container image\"" pod="opendatahub/postgres-868db5846d-qjzlx" podUID="1e259cda-389b-4038-8c99-8b966fbdd99c" Apr 17 17:18:19.721676 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:19.721616 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"postgres\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/rhel9/postgresql-16:latest@sha256:680b42d2c51b76d23cd5b68dd774af456b1e4c98c4aaeb49d0de0948dc933716\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/f5/f57fa8c0c6149ed227f93ff8adbe8ee7e70339fc8bb83bf697c1b8a30d0f7f33?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260417%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260417T171816Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=29ca32f600fbd0ae2673daee5c5bcdee5e4f69b9efa456bd214d0f92a1d0f17c&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=rhel9----postgresql-16&akamai_signature=exp=1776447196~hmac=da42cf8b333ed43025d29d220bb1d164f1a74bbbdf96187d0aec639c4639a837\\\": remote error: tls: internal error; artifact err: provided artifact is a container image\"" pod="opendatahub/postgres-868db5846d-qjzlx" podUID="1e259cda-389b-4038-8c99-8b966fbdd99c" Apr 17 17:18:36.654591 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:36.654566 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 17:18:36.779375 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:36.779334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qjzlx" event={"ID":"1e259cda-389b-4038-8c99-8b966fbdd99c","Type":"ContainerStarted","Data":"4aab86a54d9ad0121e3b70efd05b23c6a0df7cd6dec826c137ef2a770a0e1897"} Apr 17 17:18:36.779596 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:36.779581 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:36.796700 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:36.796646 2568 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="opendatahub/postgres-868db5846d-qjzlx" podStartSLOduration=0.759633988 podStartE2EDuration="20.796634255s" podCreationTimestamp="2026-04-17 17:18:16 +0000 UTC" firstStartedPulling="2026-04-17 17:18:16.614727184 +0000 UTC m=+650.230860425" lastFinishedPulling="2026-04-17 17:18:36.651727451 +0000 UTC m=+670.267860692" observedRunningTime="2026-04-17 17:18:36.795768975 +0000 UTC m=+670.411902237" watchObservedRunningTime="2026-04-17 17:18:36.796634255 +0000 UTC m=+670.412767518" Apr 17 17:18:42.812168 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:42.812136 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-qjzlx" Apr 17 17:18:43.743169 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.743134 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-74d7bfb697-pczxp"] Apr 17 17:18:43.745115 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.745099 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:43.748198 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.748169 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 17:18:43.748321 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.748201 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7k4fn\"" Apr 17 17:18:43.748321 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.748213 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 17:18:43.755666 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.755645 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74d7bfb697-pczxp"] Apr 17 17:18:43.849669 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.849634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls\") pod \"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:43.850044 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.849686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7kk\" (UniqueName: \"kubernetes.io/projected/76790239-9ac4-4ab5-9585-45554de53103-kube-api-access-ch7kk\") pod \"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:43.950604 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.950573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls\") pod 
\"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:43.950765 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.950615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7kk\" (UniqueName: \"kubernetes.io/projected/76790239-9ac4-4ab5-9585-45554de53103-kube-api-access-ch7kk\") pod \"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:43.950765 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:43.950726 2568 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 17 17:18:43.950836 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:43.950800 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls podName:76790239-9ac4-4ab5-9585-45554de53103 nodeName:}" failed. No retries permitted until 2026-04-17 17:18:44.450782814 +0000 UTC m=+678.066916057 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls") pod "maas-api-74d7bfb697-pczxp" (UID: "76790239-9ac4-4ab5-9585-45554de53103") : secret "maas-api-serving-cert" not found Apr 17 17:18:43.961590 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:43.961561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7kk\" (UniqueName: \"kubernetes.io/projected/76790239-9ac4-4ab5-9585-45554de53103-kube-api-access-ch7kk\") pod \"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:44.454161 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.454124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls\") pod \"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:44.456493 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.456472 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls\") pod \"maas-api-74d7bfb697-pczxp\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:44.491343 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.491316 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lmlrl"] Apr 17 17:18:44.493388 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.493373 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:44.502835 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.502811 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lmlrl"] Apr 17 17:18:44.554772 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.554745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvzb\" (UniqueName: \"kubernetes.io/projected/fb54beea-fc14-40e0-9b4f-6dca2d9d4177-kube-api-access-gpvzb\") pod \"authorino-8b475cf9f-lmlrl\" (UID: \"fb54beea-fc14-40e0-9b4f-6dca2d9d4177\") " pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:44.655175 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.655150 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:44.655300 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.655273 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvzb\" (UniqueName: \"kubernetes.io/projected/fb54beea-fc14-40e0-9b4f-6dca2d9d4177-kube-api-access-gpvzb\") pod \"authorino-8b475cf9f-lmlrl\" (UID: \"fb54beea-fc14-40e0-9b4f-6dca2d9d4177\") " pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:44.663520 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.663499 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvzb\" (UniqueName: \"kubernetes.io/projected/fb54beea-fc14-40e0-9b4f-6dca2d9d4177-kube-api-access-gpvzb\") pod \"authorino-8b475cf9f-lmlrl\" (UID: \"fb54beea-fc14-40e0-9b4f-6dca2d9d4177\") " pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:44.713474 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.713438 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lmlrl"] Apr 17 17:18:44.713741 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:18:44.713724 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:44.739930 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.739897 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5bd449c8f5-4zv95"] Apr 17 17:18:44.743660 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.743581 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:44.750197 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.750145 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5bd449c8f5-4zv95"] Apr 17 17:18:44.757005 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.756863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kgzp\" (UniqueName: \"kubernetes.io/projected/787b1cad-2a68-4df7-8ad9-3b51b346a563-kube-api-access-5kgzp\") pod \"authorino-5bd449c8f5-4zv95\" (UID: \"787b1cad-2a68-4df7-8ad9-3b51b346a563\") " pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:44.852564 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.851212 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5bd449c8f5-4zv95"] Apr 17 17:18:44.859364 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.856203 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74d7bfb697-pczxp"] Apr 17 17:18:44.859364 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:44.858199 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5kgzp], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-5bd449c8f5-4zv95" podUID="787b1cad-2a68-4df7-8ad9-3b51b346a563" Apr 17 17:18:44.859859 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.859693 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5kgzp\" (UniqueName: \"kubernetes.io/projected/787b1cad-2a68-4df7-8ad9-3b51b346a563-kube-api-access-5kgzp\") pod \"authorino-5bd449c8f5-4zv95\" (UID: \"787b1cad-2a68-4df7-8ad9-3b51b346a563\") " pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:44.861125 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:18:44.861058 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76790239_9ac4_4ab5_9585_45554de53103.slice/crio-93605a6182ac3437f2601e1b5dbbf7883211859a98bc55fbf72924891c17b3df WatchSource:0}: Error finding container 93605a6182ac3437f2601e1b5dbbf7883211859a98bc55fbf72924891c17b3df: Status 404 returned error can't find the container with id 93605a6182ac3437f2601e1b5dbbf7883211859a98bc55fbf72924891c17b3df Apr 17 17:18:44.874046 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.874007 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kgzp\" (UniqueName: \"kubernetes.io/projected/787b1cad-2a68-4df7-8ad9-3b51b346a563-kube-api-access-5kgzp\") pod \"authorino-5bd449c8f5-4zv95\" (UID: \"787b1cad-2a68-4df7-8ad9-3b51b346a563\") " pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:44.878221 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.878197 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-798cc5f5dc-7pf9x"] Apr 17 17:18:44.881268 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.880598 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:44.883673 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.883657 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 17:18:44.889456 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.889366 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-798cc5f5dc-7pf9x"] Apr 17 17:18:44.911672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.911646 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lmlrl"] Apr 17 17:18:44.914229 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:18:44.914208 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb54beea_fc14_40e0_9b4f_6dca2d9d4177.slice/crio-be62ec31462f33ba62a33423ded2bdd0f2de22d02c2ea27c79760ffa2df61ca4 WatchSource:0}: Error finding container be62ec31462f33ba62a33423ded2bdd0f2de22d02c2ea27c79760ffa2df61ca4: Status 404 returned error can't find the container with id be62ec31462f33ba62a33423ded2bdd0f2de22d02c2ea27c79760ffa2df61ca4 Apr 17 17:18:44.960879 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.960787 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9jwm\" (UniqueName: \"kubernetes.io/projected/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-kube-api-access-n9jwm\") pod \"authorino-798cc5f5dc-7pf9x\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:44.960879 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:44.960849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-tls-cert\") pod \"authorino-798cc5f5dc-7pf9x\" (UID: 
\"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:45.061557 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.061520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-tls-cert\") pod \"authorino-798cc5f5dc-7pf9x\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:45.061741 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.061605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9jwm\" (UniqueName: \"kubernetes.io/projected/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-kube-api-access-n9jwm\") pod \"authorino-798cc5f5dc-7pf9x\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:45.063911 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.063890 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-tls-cert\") pod \"authorino-798cc5f5dc-7pf9x\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:45.072724 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.072700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9jwm\" (UniqueName: \"kubernetes.io/projected/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-kube-api-access-n9jwm\") pod \"authorino-798cc5f5dc-7pf9x\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:45.191820 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.191783 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:18:45.343328 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.340747 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-798cc5f5dc-7pf9x"] Apr 17 17:18:45.357337 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:18:45.357288 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59a9adf1_0757_419c_8b6a_bdcbb28b1e34.slice/crio-4d962c46e2748d7f86f5617ea11406d1a5110fce79ba6d088e8d89c32a80420e WatchSource:0}: Error finding container 4d962c46e2748d7f86f5617ea11406d1a5110fce79ba6d088e8d89c32a80420e: Status 404 returned error can't find the container with id 4d962c46e2748d7f86f5617ea11406d1a5110fce79ba6d088e8d89c32a80420e Apr 17 17:18:45.813034 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.812886 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" event={"ID":"fb54beea-fc14-40e0-9b4f-6dca2d9d4177","Type":"ContainerStarted","Data":"244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa"} Apr 17 17:18:45.813034 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.812928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" event={"ID":"fb54beea-fc14-40e0-9b4f-6dca2d9d4177","Type":"ContainerStarted","Data":"be62ec31462f33ba62a33423ded2bdd0f2de22d02c2ea27c79760ffa2df61ca4"} Apr 17 17:18:45.813034 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.812945 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" podUID="fb54beea-fc14-40e0-9b4f-6dca2d9d4177" containerName="authorino" containerID="cri-o://244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa" gracePeriod=30 Apr 17 17:18:45.814320 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.814284 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="opendatahub/maas-api-74d7bfb697-pczxp" event={"ID":"76790239-9ac4-4ab5-9585-45554de53103","Type":"ContainerStarted","Data":"93605a6182ac3437f2601e1b5dbbf7883211859a98bc55fbf72924891c17b3df"} Apr 17 17:18:45.815445 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.815426 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:45.815739 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.815720 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" event={"ID":"59a9adf1-0757-419c-8b6a-bdcbb28b1e34","Type":"ContainerStarted","Data":"4d962c46e2748d7f86f5617ea11406d1a5110fce79ba6d088e8d89c32a80420e"} Apr 17 17:18:45.844023 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.841444 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:45.867299 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.867193 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kgzp\" (UniqueName: \"kubernetes.io/projected/787b1cad-2a68-4df7-8ad9-3b51b346a563-kube-api-access-5kgzp\") pod \"787b1cad-2a68-4df7-8ad9-3b51b346a563\" (UID: \"787b1cad-2a68-4df7-8ad9-3b51b346a563\") " Apr 17 17:18:45.869498 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.869473 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787b1cad-2a68-4df7-8ad9-3b51b346a563-kube-api-access-5kgzp" (OuterVolumeSpecName: "kube-api-access-5kgzp") pod "787b1cad-2a68-4df7-8ad9-3b51b346a563" (UID: "787b1cad-2a68-4df7-8ad9-3b51b346a563"). InnerVolumeSpecName "kube-api-access-5kgzp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:18:45.968188 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:45.968164 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5kgzp\" (UniqueName: \"kubernetes.io/projected/787b1cad-2a68-4df7-8ad9-3b51b346a563-kube-api-access-5kgzp\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:46.155123 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.155094 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:46.171819 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.170066 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvzb\" (UniqueName: \"kubernetes.io/projected/fb54beea-fc14-40e0-9b4f-6dca2d9d4177-kube-api-access-gpvzb\") pod \"fb54beea-fc14-40e0-9b4f-6dca2d9d4177\" (UID: \"fb54beea-fc14-40e0-9b4f-6dca2d9d4177\") " Apr 17 17:18:46.173204 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.173165 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb54beea-fc14-40e0-9b4f-6dca2d9d4177-kube-api-access-gpvzb" (OuterVolumeSpecName: "kube-api-access-gpvzb") pod "fb54beea-fc14-40e0-9b4f-6dca2d9d4177" (UID: "fb54beea-fc14-40e0-9b4f-6dca2d9d4177"). InnerVolumeSpecName "kube-api-access-gpvzb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:18:46.271756 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.271726 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gpvzb\" (UniqueName: \"kubernetes.io/projected/fb54beea-fc14-40e0-9b4f-6dca2d9d4177-kube-api-access-gpvzb\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:46.845352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.820264 2568 generic.go:358] "Generic (PLEG): container finished" podID="fb54beea-fc14-40e0-9b4f-6dca2d9d4177" containerID="244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa" exitCode=0 Apr 17 17:18:46.845352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.820346 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" Apr 17 17:18:46.845352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.835354 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" event={"ID":"fb54beea-fc14-40e0-9b4f-6dca2d9d4177","Type":"ContainerDied","Data":"244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa"} Apr 17 17:18:46.845352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.835400 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lmlrl" event={"ID":"fb54beea-fc14-40e0-9b4f-6dca2d9d4177","Type":"ContainerDied","Data":"be62ec31462f33ba62a33423ded2bdd0f2de22d02c2ea27c79760ffa2df61ca4"} Apr 17 17:18:46.845352 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.835423 2568 scope.go:117] "RemoveContainer" containerID="244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa" Apr 17 17:18:46.847686 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.847480 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5bd449c8f5-4zv95" Apr 17 17:18:46.847686 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.847513 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" event={"ID":"59a9adf1-0757-419c-8b6a-bdcbb28b1e34","Type":"ContainerStarted","Data":"c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c"} Apr 17 17:18:46.863833 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.863787 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" podStartSLOduration=2.4726876 podStartE2EDuration="2.863771568s" podCreationTimestamp="2026-04-17 17:18:44 +0000 UTC" firstStartedPulling="2026-04-17 17:18:45.358638302 +0000 UTC m=+678.974771547" lastFinishedPulling="2026-04-17 17:18:45.749722272 +0000 UTC m=+679.365855515" observedRunningTime="2026-04-17 17:18:46.861664809 +0000 UTC m=+680.477798069" watchObservedRunningTime="2026-04-17 17:18:46.863771568 +0000 UTC m=+680.479904832" Apr 17 17:18:46.878883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.878854 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lmlrl"] Apr 17 17:18:46.881203 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.881182 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lmlrl"] Apr 17 17:18:46.885416 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.885387 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-9c4zj"] Apr 17 17:18:46.885618 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.885594 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-9c4zj" podUID="25ae0567-d16c-4054-98dd-fd48df455143" containerName="authorino" containerID="cri-o://4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575" gracePeriod=30 Apr 
17 17:18:46.907122 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.907091 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb54beea-fc14-40e0-9b4f-6dca2d9d4177" path="/var/lib/kubelet/pods/fb54beea-fc14-40e0-9b4f-6dca2d9d4177/volumes" Apr 17 17:18:46.907495 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.907457 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5bd449c8f5-4zv95"] Apr 17 17:18:46.909772 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:46.909749 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5bd449c8f5-4zv95"] Apr 17 17:18:47.217075 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.217046 2568 scope.go:117] "RemoveContainer" containerID="244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa" Apr 17 17:18:47.217451 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:47.217423 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa\": container with ID starting with 244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa not found: ID does not exist" containerID="244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa" Apr 17 17:18:47.217547 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.217465 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa"} err="failed to get container status \"244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa\": rpc error: code = NotFound desc = could not find container \"244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa\": container with ID starting with 244110456d9b6c091d014b3c613a7ae77443526eee899da3236d80249b3f3afa not found: ID does not exist" Apr 17 17:18:47.661597 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:18:47.661576 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:18:47.684568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.684546 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bb2g\" (UniqueName: \"kubernetes.io/projected/25ae0567-d16c-4054-98dd-fd48df455143-kube-api-access-9bb2g\") pod \"25ae0567-d16c-4054-98dd-fd48df455143\" (UID: \"25ae0567-d16c-4054-98dd-fd48df455143\") " Apr 17 17:18:47.686667 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.686637 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ae0567-d16c-4054-98dd-fd48df455143-kube-api-access-9bb2g" (OuterVolumeSpecName: "kube-api-access-9bb2g") pod "25ae0567-d16c-4054-98dd-fd48df455143" (UID: "25ae0567-d16c-4054-98dd-fd48df455143"). InnerVolumeSpecName "kube-api-access-9bb2g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:18:47.786120 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.786054 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9bb2g\" (UniqueName: \"kubernetes.io/projected/25ae0567-d16c-4054-98dd-fd48df455143-kube-api-access-9bb2g\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:47.854024 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.853985 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74d7bfb697-pczxp" event={"ID":"76790239-9ac4-4ab5-9585-45554de53103","Type":"ContainerStarted","Data":"d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4"} Apr 17 17:18:47.854166 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.854035 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:47.854960 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.854935 2568 generic.go:358] "Generic (PLEG): 
container finished" podID="25ae0567-d16c-4054-98dd-fd48df455143" containerID="4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575" exitCode=0 Apr 17 17:18:47.855047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.854975 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-9c4zj" Apr 17 17:18:47.855047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.855008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-9c4zj" event={"ID":"25ae0567-d16c-4054-98dd-fd48df455143","Type":"ContainerDied","Data":"4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575"} Apr 17 17:18:47.855047 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.855034 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-9c4zj" event={"ID":"25ae0567-d16c-4054-98dd-fd48df455143","Type":"ContainerDied","Data":"3df3b40679e981dc142c5d2ff4750c9cc6d2e3f309bfa201e7e9ebe68350bd7b"} Apr 17 17:18:47.855149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.855052 2568 scope.go:117] "RemoveContainer" containerID="4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575" Apr 17 17:18:47.863176 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.863160 2568 scope.go:117] "RemoveContainer" containerID="4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575" Apr 17 17:18:47.863440 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:47.863423 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575\": container with ID starting with 4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575 not found: ID does not exist" containerID="4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575" Apr 17 17:18:47.863505 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.863446 
2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575"} err="failed to get container status \"4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575\": rpc error: code = NotFound desc = could not find container \"4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575\": container with ID starting with 4e3e90bfc2f1ef62241c7469d9f16c2d3c096ebf6597576bc278ec8516384575 not found: ID does not exist" Apr 17 17:18:47.874058 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.874017 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-74d7bfb697-pczxp" podStartSLOduration=2.176987243 podStartE2EDuration="4.874005641s" podCreationTimestamp="2026-04-17 17:18:43 +0000 UTC" firstStartedPulling="2026-04-17 17:18:44.862812309 +0000 UTC m=+678.478945549" lastFinishedPulling="2026-04-17 17:18:47.559830702 +0000 UTC m=+681.175963947" observedRunningTime="2026-04-17 17:18:47.872812792 +0000 UTC m=+681.488946054" watchObservedRunningTime="2026-04-17 17:18:47.874005641 +0000 UTC m=+681.490138882" Apr 17 17:18:47.884509 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.884485 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-9c4zj"] Apr 17 17:18:47.886108 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:47.886091 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-9c4zj"] Apr 17 17:18:48.906116 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:48.906082 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ae0567-d16c-4054-98dd-fd48df455143" path="/var/lib/kubelet/pods/25ae0567-d16c-4054-98dd-fd48df455143/volumes" Apr 17 17:18:48.906560 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:48.906398 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="787b1cad-2a68-4df7-8ad9-3b51b346a563" path="/var/lib/kubelet/pods/787b1cad-2a68-4df7-8ad9-3b51b346a563/volumes" Apr 17 17:18:53.406447 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.406416 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-74d7bfb697-pczxp"] Apr 17 17:18:53.406883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.406663 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-74d7bfb697-pczxp" podUID="76790239-9ac4-4ab5-9585-45554de53103" containerName="maas-api" containerID="cri-o://d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4" gracePeriod=30 Apr 17 17:18:53.411337 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.411297 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:53.641396 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.641376 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:53.727967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.727880 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch7kk\" (UniqueName: \"kubernetes.io/projected/76790239-9ac4-4ab5-9585-45554de53103-kube-api-access-ch7kk\") pod \"76790239-9ac4-4ab5-9585-45554de53103\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " Apr 17 17:18:53.727967 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.727924 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls\") pod \"76790239-9ac4-4ab5-9585-45554de53103\" (UID: \"76790239-9ac4-4ab5-9585-45554de53103\") " Apr 17 17:18:53.730149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.730118 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "76790239-9ac4-4ab5-9585-45554de53103" (UID: "76790239-9ac4-4ab5-9585-45554de53103"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:18:53.730149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.730114 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76790239-9ac4-4ab5-9585-45554de53103-kube-api-access-ch7kk" (OuterVolumeSpecName: "kube-api-access-ch7kk") pod "76790239-9ac4-4ab5-9585-45554de53103" (UID: "76790239-9ac4-4ab5-9585-45554de53103"). InnerVolumeSpecName "kube-api-access-ch7kk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:18:53.829079 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.829043 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ch7kk\" (UniqueName: \"kubernetes.io/projected/76790239-9ac4-4ab5-9585-45554de53103-kube-api-access-ch7kk\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:53.829079 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.829071 2568 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76790239-9ac4-4ab5-9585-45554de53103-maas-api-tls\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:18:53.874386 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.874355 2568 generic.go:358] "Generic (PLEG): container finished" podID="76790239-9ac4-4ab5-9585-45554de53103" containerID="d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4" exitCode=0 Apr 17 17:18:53.874540 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.874395 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74d7bfb697-pczxp" event={"ID":"76790239-9ac4-4ab5-9585-45554de53103","Type":"ContainerDied","Data":"d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4"} Apr 17 17:18:53.874540 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.874417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74d7bfb697-pczxp" event={"ID":"76790239-9ac4-4ab5-9585-45554de53103","Type":"ContainerDied","Data":"93605a6182ac3437f2601e1b5dbbf7883211859a98bc55fbf72924891c17b3df"} Apr 17 17:18:53.874540 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.874416 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-74d7bfb697-pczxp" Apr 17 17:18:53.874540 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.874441 2568 scope.go:117] "RemoveContainer" containerID="d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4" Apr 17 17:18:53.882022 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.881976 2568 scope.go:117] "RemoveContainer" containerID="d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4" Apr 17 17:18:53.882317 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:18:53.882280 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4\": container with ID starting with d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4 not found: ID does not exist" containerID="d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4" Apr 17 17:18:53.882386 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.882331 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4"} err="failed to get container status \"d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4\": rpc error: code = NotFound desc = could not find container \"d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4\": container with ID starting with d9d643f3881c338acdb9172860ff846e415903343f00261936ad7aacd141b0c4 not found: ID does not exist" Apr 17 17:18:53.895858 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.895832 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-74d7bfb697-pczxp"] Apr 17 17:18:53.899744 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:18:53.899725 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-74d7bfb697-pczxp"] Apr 17 17:18:54.905927 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:18:54.905896 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76790239-9ac4-4ab5-9585-45554de53103" path="/var/lib/kubelet/pods/76790239-9ac4-4ab5-9585-45554de53103/volumes" Apr 17 17:19:20.521875 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.521831 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv"] Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522136 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb54beea-fc14-40e0-9b4f-6dca2d9d4177" containerName="authorino" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522146 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb54beea-fc14-40e0-9b4f-6dca2d9d4177" containerName="authorino" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522158 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ae0567-d16c-4054-98dd-fd48df455143" containerName="authorino" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522163 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ae0567-d16c-4054-98dd-fd48df455143" containerName="authorino" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522171 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76790239-9ac4-4ab5-9585-45554de53103" containerName="maas-api" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522177 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="76790239-9ac4-4ab5-9585-45554de53103" containerName="maas-api" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522218 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="76790239-9ac4-4ab5-9585-45554de53103" containerName="maas-api" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:19:20.522228 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ae0567-d16c-4054-98dd-fd48df455143" containerName="authorino" Apr 17 17:19:20.522285 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.522234 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb54beea-fc14-40e0-9b4f-6dca2d9d4177" containerName="authorino" Apr 17 17:19:20.528533 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.528513 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.531498 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.531476 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 17:19:20.532641 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.532620 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gnhc5\"" Apr 17 17:19:20.532929 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.532644 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 17:19:20.532929 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.532678 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 17:19:20.534833 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.534812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv"] Apr 17 17:19:20.642392 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.642357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d86e14f-974e-4808-b3b5-222c0d74a62b-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.642568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.642413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.642568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.642430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.642568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.642447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.642568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.642510 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.642568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.642555 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8pm\" (UniqueName: \"kubernetes.io/projected/3d86e14f-974e-4808-b3b5-222c0d74a62b-kube-api-access-cv8pm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.743729 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.743700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d86e14f-974e-4808-b3b5-222c0d74a62b-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.743882 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.743744 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.743882 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.743773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.743882 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.743797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" 
(UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.743882 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.743837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.744031 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.743864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8pm\" (UniqueName: \"kubernetes.io/projected/3d86e14f-974e-4808-b3b5-222c0d74a62b-kube-api-access-cv8pm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.744146 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.744118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.744283 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.744183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.744283 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.744255 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.746105 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.746087 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3d86e14f-974e-4808-b3b5-222c0d74a62b-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.746276 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.746259 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3d86e14f-974e-4808-b3b5-222c0d74a62b-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.751448 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.751427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8pm\" (UniqueName: \"kubernetes.io/projected/3d86e14f-974e-4808-b3b5-222c0d74a62b-kube-api-access-cv8pm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-k75rv\" (UID: \"3d86e14f-974e-4808-b3b5-222c0d74a62b\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.839673 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.839604 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:20.958027 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.957992 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv"] Apr 17 17:19:20.960652 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:19:20.960626 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d86e14f_974e_4808_b3b5_222c0d74a62b.slice/crio-5f966bea3a30bf86d8c01bece843f5c8ba4994e2a89aa46d849062b32497847b WatchSource:0}: Error finding container 5f966bea3a30bf86d8c01bece843f5c8ba4994e2a89aa46d849062b32497847b: Status 404 returned error can't find the container with id 5f966bea3a30bf86d8c01bece843f5c8ba4994e2a89aa46d849062b32497847b Apr 17 17:19:20.962436 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:20.962421 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:19:21.957910 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:21.957874 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" event={"ID":"3d86e14f-974e-4808-b3b5-222c0d74a62b","Type":"ContainerStarted","Data":"5f966bea3a30bf86d8c01bece843f5c8ba4994e2a89aa46d849062b32497847b"} Apr 17 17:19:27.979894 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:27.979857 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" event={"ID":"3d86e14f-974e-4808-b3b5-222c0d74a62b","Type":"ContainerStarted","Data":"161a763fe20aed8c3e7579b0f35046a32520fc9ffea432bab0c9cc799e8d41b3"} Apr 17 17:19:32.996533 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:32.996432 2568 generic.go:358] "Generic (PLEG): container finished" podID="3d86e14f-974e-4808-b3b5-222c0d74a62b" containerID="161a763fe20aed8c3e7579b0f35046a32520fc9ffea432bab0c9cc799e8d41b3" exitCode=0 Apr 17 
17:19:32.996533 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:32.996501 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" event={"ID":"3d86e14f-974e-4808-b3b5-222c0d74a62b","Type":"ContainerDied","Data":"161a763fe20aed8c3e7579b0f35046a32520fc9ffea432bab0c9cc799e8d41b3"} Apr 17 17:19:35.005348 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:35.005292 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" event={"ID":"3d86e14f-974e-4808-b3b5-222c0d74a62b","Type":"ContainerStarted","Data":"39b42396b4df483a119c2ed127d5d1524ea5f5c0f77324d0e44bec462a7784fa"} Apr 17 17:19:35.005754 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:35.005544 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:35.024043 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:35.023976 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" podStartSLOduration=1.87121366 podStartE2EDuration="15.023957275s" podCreationTimestamp="2026-04-17 17:19:20 +0000 UTC" firstStartedPulling="2026-04-17 17:19:20.962541677 +0000 UTC m=+714.578674918" lastFinishedPulling="2026-04-17 17:19:34.11528529 +0000 UTC m=+727.731418533" observedRunningTime="2026-04-17 17:19:35.022254358 +0000 UTC m=+728.638387620" watchObservedRunningTime="2026-04-17 17:19:35.023957275 +0000 UTC m=+728.640090539" Apr 17 17:19:45.227784 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.227745 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd"] Apr 17 17:19:45.231207 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.231182 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.233807 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.233787 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 17:19:45.239980 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.239956 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd"] Apr 17 17:19:45.241565 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.241545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.241666 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.241580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.241666 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.241623 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pgvv\" (UniqueName: \"kubernetes.io/projected/f1cd75ab-978b-4336-bc68-f96bf0d8d177-kube-api-access-4pgvv\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.241773 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.241701 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.241773 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.241740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.241879 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.241781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1cd75ab-978b-4336-bc68-f96bf0d8d177-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.342693 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.342659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.342886 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.342698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-model-cache\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.342886 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.342745 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pgvv\" (UniqueName: \"kubernetes.io/projected/f1cd75ab-978b-4336-bc68-f96bf0d8d177-kube-api-access-4pgvv\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.342886 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.342786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.342886 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.342809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.342886 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.342842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1cd75ab-978b-4336-bc68-f96bf0d8d177-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.343199 ip-10-0-134-244 
kubenswrapper[2568]: I0417 17:19:45.343176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.343261 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.343183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.343326 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.343268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.345048 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.345023 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f1cd75ab-978b-4336-bc68-f96bf0d8d177-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.345199 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.345180 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f1cd75ab-978b-4336-bc68-f96bf0d8d177-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.350841 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.350822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pgvv\" (UniqueName: \"kubernetes.io/projected/f1cd75ab-978b-4336-bc68-f96bf0d8d177-kube-api-access-4pgvv\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd\" (UID: \"f1cd75ab-978b-4336-bc68-f96bf0d8d177\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.541874 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.541841 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:45.660217 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:45.660182 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd"] Apr 17 17:19:45.662869 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:19:45.662828 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1cd75ab_978b_4336_bc68_f96bf0d8d177.slice/crio-c384042b7d6a50a2c6d08dfab4cdeeab33e2fea3e67d6094ae3b7515c428bcc1 WatchSource:0}: Error finding container c384042b7d6a50a2c6d08dfab4cdeeab33e2fea3e67d6094ae3b7515c428bcc1: Status 404 returned error can't find the container with id c384042b7d6a50a2c6d08dfab4cdeeab33e2fea3e67d6094ae3b7515c428bcc1 Apr 17 17:19:46.025489 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:46.025453 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-k75rv" Apr 17 17:19:46.046733 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:46.046695 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" event={"ID":"f1cd75ab-978b-4336-bc68-f96bf0d8d177","Type":"ContainerStarted","Data":"43b47c64d3e382a0195d0235618e222b60767ce35411e894471f70af6afdd24a"} Apr 17 17:19:46.046733 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:46.046728 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" event={"ID":"f1cd75ab-978b-4336-bc68-f96bf0d8d177","Type":"ContainerStarted","Data":"c384042b7d6a50a2c6d08dfab4cdeeab33e2fea3e67d6094ae3b7515c428bcc1"} Apr 17 17:19:51.064864 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:51.064829 2568 generic.go:358] "Generic (PLEG): container finished" podID="f1cd75ab-978b-4336-bc68-f96bf0d8d177" containerID="43b47c64d3e382a0195d0235618e222b60767ce35411e894471f70af6afdd24a" exitCode=0 Apr 17 17:19:51.064864 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:51.064869 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" event={"ID":"f1cd75ab-978b-4336-bc68-f96bf0d8d177","Type":"ContainerDied","Data":"43b47c64d3e382a0195d0235618e222b60767ce35411e894471f70af6afdd24a"} Apr 17 17:19:52.070487 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:52.070450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" event={"ID":"f1cd75ab-978b-4336-bc68-f96bf0d8d177","Type":"ContainerStarted","Data":"36e413ddbf02b0e20d3a59bf5a7dac5483378a3d50db6c66510a9de4cfc22304"} Apr 17 17:19:52.070918 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:52.070660 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:19:52.089821 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:19:52.089774 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" 
podStartSLOduration=6.852830652 podStartE2EDuration="7.089762414s" podCreationTimestamp="2026-04-17 17:19:45 +0000 UTC" firstStartedPulling="2026-04-17 17:19:51.065461243 +0000 UTC m=+744.681594488" lastFinishedPulling="2026-04-17 17:19:51.302392994 +0000 UTC m=+744.918526250" observedRunningTime="2026-04-17 17:19:52.08885467 +0000 UTC m=+745.704987944" watchObservedRunningTime="2026-04-17 17:19:52.089762414 +0000 UTC m=+745.705895677" Apr 17 17:20:03.087482 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:03.087453 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd" Apr 17 17:20:05.751860 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.751822 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz"] Apr 17 17:20:05.782255 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.782226 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz"] Apr 17 17:20:05.782418 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.782374 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:05.785721 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.785703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 17:20:05.904853 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.904820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:05.904853 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.904853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:05.905099 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.904959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:05.905099 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.904999 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: 
\"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:05.905099 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.905022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:05.905099 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:05.905062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bphl\" (UniqueName: \"kubernetes.io/projected/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-kube-api-access-6bphl\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006261 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006291 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bphl\" (UniqueName: \"kubernetes.io/projected/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-kube-api-access-6bphl\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006607 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006607 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006388 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006757 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006732 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-kserve-provision-location\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.006828 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006793 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.007016 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.006993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.008714 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.008690 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.008805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.008793 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.014926 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.014903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bphl\" 
(UniqueName: \"kubernetes.io/projected/e8c67d61-0ccd-4023-a9e1-1da69fb5bc30-kube-api-access-6bphl\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz\" (UID: \"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.092672 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.092649 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:06.212765 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:06.212741 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz"] Apr 17 17:20:06.214690 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:20:06.214658 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c67d61_0ccd_4023_a9e1_1da69fb5bc30.slice/crio-83c0a67ff5b6bdfe9784035f18098b7f2c0257d1e29a0a4530f79e9624dbe065 WatchSource:0}: Error finding container 83c0a67ff5b6bdfe9784035f18098b7f2c0257d1e29a0a4530f79e9624dbe065: Status 404 returned error can't find the container with id 83c0a67ff5b6bdfe9784035f18098b7f2c0257d1e29a0a4530f79e9624dbe065 Apr 17 17:20:07.118557 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:07.118520 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" event={"ID":"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30","Type":"ContainerStarted","Data":"fb9142c4367854d8bc43692ec99a67dae8ca7aebdd4462712ed90f0d4c198b30"} Apr 17 17:20:07.118557 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:07.118559 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" event={"ID":"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30","Type":"ContainerStarted","Data":"83c0a67ff5b6bdfe9784035f18098b7f2c0257d1e29a0a4530f79e9624dbe065"} Apr 17 17:20:12.136419 ip-10-0-134-244 kubenswrapper[2568]: I0417 
17:20:12.136378 2568 generic.go:358] "Generic (PLEG): container finished" podID="e8c67d61-0ccd-4023-a9e1-1da69fb5bc30" containerID="fb9142c4367854d8bc43692ec99a67dae8ca7aebdd4462712ed90f0d4c198b30" exitCode=0 Apr 17 17:20:12.136776 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:12.136455 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" event={"ID":"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30","Type":"ContainerDied","Data":"fb9142c4367854d8bc43692ec99a67dae8ca7aebdd4462712ed90f0d4c198b30"} Apr 17 17:20:13.140671 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:13.140638 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" event={"ID":"e8c67d61-0ccd-4023-a9e1-1da69fb5bc30","Type":"ContainerStarted","Data":"ee881d697de5a1670858b20ea29fece54159fadac84538bc6f1a36d82d0c4674"} Apr 17 17:20:13.141054 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:13.140858 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 17:20:13.160376 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:13.160330 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" podStartSLOduration=7.953730147 podStartE2EDuration="8.160295224s" podCreationTimestamp="2026-04-17 17:20:05 +0000 UTC" firstStartedPulling="2026-04-17 17:20:12.137051844 +0000 UTC m=+765.753185085" lastFinishedPulling="2026-04-17 17:20:12.343616906 +0000 UTC m=+765.959750162" observedRunningTime="2026-04-17 17:20:13.158361694 +0000 UTC m=+766.774494956" watchObservedRunningTime="2026-04-17 17:20:13.160295224 +0000 UTC m=+766.776428486" Apr 17 17:20:24.156858 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:24.156831 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz" Apr 17 
17:20:25.073254 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.073219 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-688cc84c7d-j699c"] Apr 17 17:20:25.077665 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.077644 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.083081 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.083055 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-688cc84c7d-j699c"] Apr 17 17:20:25.163198 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.163160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/dc68b0bc-f70a-4d70-9605-8e339eb617cf-tls-cert\") pod \"authorino-688cc84c7d-j699c\" (UID: \"dc68b0bc-f70a-4d70-9605-8e339eb617cf\") " pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.163598 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.163219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8856d\" (UniqueName: \"kubernetes.io/projected/dc68b0bc-f70a-4d70-9605-8e339eb617cf-kube-api-access-8856d\") pod \"authorino-688cc84c7d-j699c\" (UID: \"dc68b0bc-f70a-4d70-9605-8e339eb617cf\") " pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.263687 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.263644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/dc68b0bc-f70a-4d70-9605-8e339eb617cf-tls-cert\") pod \"authorino-688cc84c7d-j699c\" (UID: \"dc68b0bc-f70a-4d70-9605-8e339eb617cf\") " pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.263867 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.263738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8856d\" (UniqueName: \"kubernetes.io/projected/dc68b0bc-f70a-4d70-9605-8e339eb617cf-kube-api-access-8856d\") pod \"authorino-688cc84c7d-j699c\" (UID: \"dc68b0bc-f70a-4d70-9605-8e339eb617cf\") " pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.266159 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.266134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/dc68b0bc-f70a-4d70-9605-8e339eb617cf-tls-cert\") pod \"authorino-688cc84c7d-j699c\" (UID: \"dc68b0bc-f70a-4d70-9605-8e339eb617cf\") " pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.273902 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.273881 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8856d\" (UniqueName: \"kubernetes.io/projected/dc68b0bc-f70a-4d70-9605-8e339eb617cf-kube-api-access-8856d\") pod \"authorino-688cc84c7d-j699c\" (UID: \"dc68b0bc-f70a-4d70-9605-8e339eb617cf\") " pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.388401 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.388294 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-688cc84c7d-j699c" Apr 17 17:20:25.505764 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:25.505579 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-688cc84c7d-j699c"] Apr 17 17:20:25.508027 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:20:25.508002 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc68b0bc_f70a_4d70_9605_8e339eb617cf.slice/crio-67c7ab71be70a08c039d7549734528a7d1f2f13b6c3046c2128014a2ef433511 WatchSource:0}: Error finding container 67c7ab71be70a08c039d7549734528a7d1f2f13b6c3046c2128014a2ef433511: Status 404 returned error can't find the container with id 67c7ab71be70a08c039d7549734528a7d1f2f13b6c3046c2128014a2ef433511 Apr 17 17:20:26.183262 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.183172 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-688cc84c7d-j699c" event={"ID":"dc68b0bc-f70a-4d70-9605-8e339eb617cf","Type":"ContainerStarted","Data":"51f028417927a7879f7153d1f6db6870a7d0e5001601f771ee65d18006d313d6"} Apr 17 17:20:26.183262 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.183213 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-688cc84c7d-j699c" event={"ID":"dc68b0bc-f70a-4d70-9605-8e339eb617cf","Type":"ContainerStarted","Data":"67c7ab71be70a08c039d7549734528a7d1f2f13b6c3046c2128014a2ef433511"} Apr 17 17:20:26.206073 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.206019 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-688cc84c7d-j699c" podStartSLOduration=0.83771779 podStartE2EDuration="1.206004205s" podCreationTimestamp="2026-04-17 17:20:25 +0000 UTC" firstStartedPulling="2026-04-17 17:20:25.509681024 +0000 UTC m=+779.125814264" lastFinishedPulling="2026-04-17 17:20:25.877967434 +0000 UTC m=+779.494100679" 
observedRunningTime="2026-04-17 17:20:26.205578141 +0000 UTC m=+779.821711404" watchObservedRunningTime="2026-04-17 17:20:26.206004205 +0000 UTC m=+779.822137468" Apr 17 17:20:26.251613 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.251580 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-798cc5f5dc-7pf9x"] Apr 17 17:20:26.251848 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.251822 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" podUID="59a9adf1-0757-419c-8b6a-bdcbb28b1e34" containerName="authorino" containerID="cri-o://c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c" gracePeriod=30 Apr 17 17:20:26.500730 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.500706 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:20:26.574261 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.574221 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-tls-cert\") pod \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " Apr 17 17:20:26.574488 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.574298 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9jwm\" (UniqueName: \"kubernetes.io/projected/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-kube-api-access-n9jwm\") pod \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\" (UID: \"59a9adf1-0757-419c-8b6a-bdcbb28b1e34\") " Apr 17 17:20:26.576365 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.576325 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-kube-api-access-n9jwm" (OuterVolumeSpecName: "kube-api-access-n9jwm") pod 
"59a9adf1-0757-419c-8b6a-bdcbb28b1e34" (UID: "59a9adf1-0757-419c-8b6a-bdcbb28b1e34"). InnerVolumeSpecName "kube-api-access-n9jwm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:20:26.583690 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.583663 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "59a9adf1-0757-419c-8b6a-bdcbb28b1e34" (UID: "59a9adf1-0757-419c-8b6a-bdcbb28b1e34"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:20:26.675187 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.675146 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9jwm\" (UniqueName: \"kubernetes.io/projected/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-kube-api-access-n9jwm\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:20:26.675187 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:26.675180 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/59a9adf1-0757-419c-8b6a-bdcbb28b1e34-tls-cert\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 17 17:20:27.187843 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.187754 2568 generic.go:358] "Generic (PLEG): container finished" podID="59a9adf1-0757-419c-8b6a-bdcbb28b1e34" containerID="c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c" exitCode=0 Apr 17 17:20:27.188266 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.187838 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" Apr 17 17:20:27.188266 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.187844 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" event={"ID":"59a9adf1-0757-419c-8b6a-bdcbb28b1e34","Type":"ContainerDied","Data":"c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c"} Apr 17 17:20:27.188266 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.187884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-798cc5f5dc-7pf9x" event={"ID":"59a9adf1-0757-419c-8b6a-bdcbb28b1e34","Type":"ContainerDied","Data":"4d962c46e2748d7f86f5617ea11406d1a5110fce79ba6d088e8d89c32a80420e"} Apr 17 17:20:27.188266 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.187901 2568 scope.go:117] "RemoveContainer" containerID="c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c" Apr 17 17:20:27.195911 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.195898 2568 scope.go:117] "RemoveContainer" containerID="c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c" Apr 17 17:20:27.196182 ip-10-0-134-244 kubenswrapper[2568]: E0417 17:20:27.196163 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c\": container with ID starting with c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c not found: ID does not exist" containerID="c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c" Apr 17 17:20:27.196235 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.196191 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c"} err="failed to get container status \"c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c\": rpc error: code = 
NotFound desc = could not find container \"c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c\": container with ID starting with c9cb7f0d99c10b643a963ab2dd843b618c9c44ba271214de178983815baac51c not found: ID does not exist" Apr 17 17:20:27.203298 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.203274 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-798cc5f5dc-7pf9x"] Apr 17 17:20:27.207149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:27.207128 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-798cc5f5dc-7pf9x"] Apr 17 17:20:28.905493 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:20:28.905456 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59a9adf1-0757-419c-8b6a-bdcbb28b1e34" path="/var/lib/kubelet/pods/59a9adf1-0757-419c-8b6a-bdcbb28b1e34/volumes" Apr 17 17:22:26.840576 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:22:26.840550 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:22:26.841978 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:22:26.841957 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:27:26.860117 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:27:26.860089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:27:26.862880 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:27:26.862863 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:32:26.879919 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:32:26.879815 2568 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:32:26.883149 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:32:26.883133 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:37:26.907448 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:37:26.907346 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:37:26.914110 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:37:26.910896 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:42:26.932577 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:26.932472 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:42:26.936334 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:26.936316 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log" Apr 17 17:42:45.341121 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:45.341089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-688cc84c7d-j699c_dc68b0bc-f70a-4d70-9605-8e339eb617cf/authorino/0.log" Apr 17 17:42:49.637597 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:49.637563 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6569445fb5-n5xbj_1bfd898b-d4fd-435e-9577-faa2d70a9933/manager/0.log" Apr 17 17:42:49.864196 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:49.864167 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qjzlx_1e259cda-389b-4038-8c99-8b966fbdd99c/postgres/0.log" Apr 17 17:42:51.104175 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:51.104146 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-688cc84c7d-j699c_dc68b0bc-f70a-4d70-9605-8e339eb617cf/authorino/0.log" Apr 17 17:42:51.220340 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:51.220295 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-t5j8c_1e23ac42-c082-4abc-9a28-357b80204db7/manager/0.log" Apr 17 17:42:51.556609 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:51.556568 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bbhzb_d960ab89-1946-42bc-88b4-1e589e03a8c1/registry-server/0.log" Apr 17 17:42:52.249425 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:52.249395 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f6bcz6_5a3739be-a28c-4dd1-a517-ec86595a6822/istio-proxy/0.log" Apr 17 17:42:52.704707 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:52.704678 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-65pp9_e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d/istio-proxy/0.log" Apr 17 17:42:53.269314 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:53.269270 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz_e8c67d61-0ccd-4023-a9e1-1da69fb5bc30/storage-initializer/0.log" Apr 17 17:42:53.277820 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:53.277779 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-w9kgz_e8c67d61-0ccd-4023-a9e1-1da69fb5bc30/main/0.log" Apr 17 17:42:53.386206 ip-10-0-134-244 
kubenswrapper[2568]: I0417 17:42:53.386169 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-k75rv_3d86e14f-974e-4808-b3b5-222c0d74a62b/storage-initializer/0.log" Apr 17 17:42:53.394275 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:53.394248 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-k75rv_3d86e14f-974e-4808-b3b5-222c0d74a62b/main/0.log" Apr 17 17:42:53.634512 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:53.634439 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd_f1cd75ab-978b-4336-bc68-f96bf0d8d177/main/0.log" Apr 17 17:42:53.648805 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:42:53.648772 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-6zlsd_f1cd75ab-978b-4336-bc68-f96bf0d8d177/storage-initializer/0.log" Apr 17 17:43:00.030173 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:00.030143 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gfbn9_fcbb42f0-8a82-4eb2-8ab6-dca288c60dc0/global-pull-secret-syncer/0.log" Apr 17 17:43:00.184110 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:00.184078 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-n49j7_31497e40-1e80-4723-9e91-a3e5cfedce92/konnectivity-agent/0.log" Apr 17 17:43:00.257322 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:00.257272 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-244.ec2.internal_009f099669c612b1a9a7e8809b1d3526/haproxy/0.log" Apr 17 17:43:04.397753 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:04.397717 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-688cc84c7d-j699c_dc68b0bc-f70a-4d70-9605-8e339eb617cf/authorino/0.log" Apr 17 
17:43:04.429762 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:04.429733 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-t5j8c_1e23ac42-c082-4abc-9a28-357b80204db7/manager/0.log" Apr 17 17:43:04.554015 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:04.553981 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bbhzb_d960ab89-1946-42bc-88b4-1e589e03a8c1/registry-server/0.log" Apr 17 17:43:06.688769 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:06.688735 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qppqk_02d2dd0a-d826-404c-9b25-1599d2485324/node-exporter/0.log" Apr 17 17:43:06.706360 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:06.706332 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qppqk_02d2dd0a-d826-404c-9b25-1599d2485324/kube-rbac-proxy/0.log" Apr 17 17:43:06.724743 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:06.724725 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qppqk_02d2dd0a-d826-404c-9b25-1599d2485324/init-textfile/0.log" Apr 17 17:43:08.196994 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.196922 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"] Apr 17 17:43:08.197373 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.197217 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a9adf1-0757-419c-8b6a-bdcbb28b1e34" containerName="authorino" Apr 17 17:43:08.197373 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.197227 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a9adf1-0757-419c-8b6a-bdcbb28b1e34" containerName="authorino" Apr 17 17:43:08.197373 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.197273 2568 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="59a9adf1-0757-419c-8b6a-bdcbb28b1e34" containerName="authorino" Apr 17 17:43:08.199606 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.199590 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z" Apr 17 17:43:08.202050 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.202028 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ssms\"/\"kube-root-ca.crt\"" Apr 17 17:43:08.203044 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.203025 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2ssms\"/\"default-dockercfg-l4kdb\"" Apr 17 17:43:08.203167 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.203025 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ssms\"/\"openshift-service-ca.crt\"" Apr 17 17:43:08.209550 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.209526 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"] Apr 17 17:43:08.278422 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.278393 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-podres\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z" Apr 17 17:43:08.278568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.278429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-lib-modules\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: 
\"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.278568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.278495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-proc\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.278643 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.278565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-sys\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.278643 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.278584 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblh7\" (UniqueName: \"kubernetes.io/projected/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-kube-api-access-fblh7\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379190 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-sys\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379190 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-sys\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379424 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fblh7\" (UniqueName: \"kubernetes.io/projected/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-kube-api-access-fblh7\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379424 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-podres\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379424 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-lib-modules\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379424 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-proc\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379594 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379428 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-podres\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379594 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-proc\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.379594 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.379494 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-lib-modules\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.388420 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.388389 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblh7\" (UniqueName: \"kubernetes.io/projected/47f8fe25-9529-48d4-95e7-a1c2dd7fad9c-kube-api-access-fblh7\") pod \"perf-node-gather-daemonset-mrk2z\" (UID: \"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.509556 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.509469 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:08.626334 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.626285 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"]
Apr 17 17:43:08.629300 ip-10-0-134-244 kubenswrapper[2568]: W0417 17:43:08.629268 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod47f8fe25_9529_48d4_95e7_a1c2dd7fad9c.slice/crio-3ff7e2edbda2b553a923e742670ce3b1560b077be2246c608f913134033bc489 WatchSource:0}: Error finding container 3ff7e2edbda2b553a923e742670ce3b1560b077be2246c608f913134033bc489: Status 404 returned error can't find the container with id 3ff7e2edbda2b553a923e742670ce3b1560b077be2246c608f913134033bc489
Apr 17 17:43:08.631176 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:08.631160 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:43:09.431370 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:09.431332 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z" event={"ID":"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c","Type":"ContainerStarted","Data":"627095aab9b8d080472027273c990b879fdae6eb9b5230e3a69033de336712e7"}
Apr 17 17:43:09.431370 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:09.431369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z" event={"ID":"47f8fe25-9529-48d4-95e7-a1c2dd7fad9c","Type":"ContainerStarted","Data":"3ff7e2edbda2b553a923e742670ce3b1560b077be2246c608f913134033bc489"}
Apr 17 17:43:09.431769 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:09.431456 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:09.449373 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:09.449335 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z" podStartSLOduration=1.44932396 podStartE2EDuration="1.44932396s" podCreationTimestamp="2026-04-17 17:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:43:09.447638198 +0000 UTC m=+2143.063771458" watchObservedRunningTime="2026-04-17 17:43:09.44932396 +0000 UTC m=+2143.065457237"
Apr 17 17:43:10.457141 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:10.457119 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6b8st_d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897/dns/0.log"
Apr 17 17:43:10.477935 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:10.477907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6b8st_d86d134d-5cb6-4c1b-8ec9-0bc4aa4b3897/kube-rbac-proxy/0.log"
Apr 17 17:43:10.592701 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:10.592676 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4vjjz_fa447df9-716a-47c2-9ffb-b819a566f787/dns-node-resolver/0.log"
Apr 17 17:43:11.134157 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:11.134128 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j5pdx_3868d9bd-d06e-4b37-89c0-ab0c05df3fff/node-ca/0.log"
Apr 17 17:43:11.962421 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:11.962390 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f6bcz6_5a3739be-a28c-4dd1-a517-ec86595a6822/istio-proxy/0.log"
Apr 17 17:43:12.275431 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:12.275399 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-65pp9_e07126fb-0a4d-4f34-9ab0-5c2c5aa8352d/istio-proxy/0.log"
Apr 17 17:43:12.830568 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:12.830540 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-whlxq_da1e7104-4977-42cd-80d8-b8775e3a717b/serve-healthcheck-canary/0.log"
Apr 17 17:43:13.251620 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:13.251539 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qb4fz_3ae13ab2-ef52-4063-a1b8-20d01ad15774/kube-rbac-proxy/0.log"
Apr 17 17:43:13.271172 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:13.271144 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qb4fz_3ae13ab2-ef52-4063-a1b8-20d01ad15774/exporter/0.log"
Apr 17 17:43:13.291486 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:13.291459 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qb4fz_3ae13ab2-ef52-4063-a1b8-20d01ad15774/extractor/0.log"
Apr 17 17:43:15.443885 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:15.443856 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-mrk2z"
Apr 17 17:43:15.562500 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:15.562468 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6569445fb5-n5xbj_1bfd898b-d4fd-435e-9577-faa2d70a9933/manager/0.log"
Apr 17 17:43:15.604437 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:15.604409 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qjzlx_1e259cda-389b-4038-8c99-8b966fbdd99c/postgres/0.log"
Apr 17 17:43:16.745246 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:16.745211 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5448568df4-j57ld_648b1e6b-1be5-42e9-aa28-770c26cc4bb9/manager/0.log"
Apr 17 17:43:22.499374 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.499345 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/kube-multus-additional-cni-plugins/0.log"
Apr 17 17:43:22.525896 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.525867 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/egress-router-binary-copy/0.log"
Apr 17 17:43:22.552862 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.552837 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/cni-plugins/0.log"
Apr 17 17:43:22.575883 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.575860 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/bond-cni-plugin/0.log"
Apr 17 17:43:22.598100 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.598073 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/routeoverride-cni/0.log"
Apr 17 17:43:22.624383 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.624347 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/whereabouts-cni-bincopy/0.log"
Apr 17 17:43:22.651273 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.651255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b66zf_8052c1d7-4ccd-4aee-a77b-dc2cf3e75eb2/whereabouts-cni/0.log"
Apr 17 17:43:22.909595 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.909563 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v68m9_3a952143-3965-4339-ba83-4a96e9b34841/kube-multus/0.log"
Apr 17 17:43:22.932953 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.932926 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4xcb9_62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89/network-metrics-daemon/0.log"
Apr 17 17:43:22.951451 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:22.951421 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4xcb9_62e98ec0-c09a-4b6f-8e6e-4f9c9896ab89/kube-rbac-proxy/0.log"
Apr 17 17:43:24.221182 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.221124 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-controller/0.log"
Apr 17 17:43:24.246036 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.246009 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/0.log"
Apr 17 17:43:24.264661 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.264632 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovn-acl-logging/1.log"
Apr 17 17:43:24.286762 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.286732 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/kube-rbac-proxy-node/0.log"
Apr 17 17:43:24.314891 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.314863 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 17:43:24.336790 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.336757 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/northd/0.log"
Apr 17 17:43:24.358148 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.358108 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/nbdb/0.log"
Apr 17 17:43:24.379209 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.379185 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/sbdb/0.log"
Apr 17 17:43:24.536051 ip-10-0-134-244 kubenswrapper[2568]: I0417 17:43:24.535983 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-97rxf_d848a18d-f010-4ec0-898d-c9d149265ab6/ovnkube-controller/0.log"