Apr 22 19:03:41.741860 ip-10-0-141-191 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:03:41.741868 ip-10-0-141-191 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:03:41.741875 ip-10-0-141-191 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:03:41.742085 ip-10-0-141-191 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:03:51.884922 ip-10-0-141-191 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:03:51.884943 ip-10-0-141-191 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 9f23d25c69f24a379319b73c99de0467 --
Apr 22 19:06:15.196851 ip-10-0-141-191 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:06:15.656634 ip-10-0-141-191 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:15.656634 ip-10-0-141-191 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:06:15.656634 ip-10-0-141-191 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:15.656634 ip-10-0-141-191 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:06:15.656634 ip-10-0-141-191 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:15.658986 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.658902 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:06:15.661312 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661291 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:15.661312 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661308 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:15.661312 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661313 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:15.661312 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661317 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661324 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661330 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661335 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661340 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661344 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661349 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661354 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661358 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661362 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661365 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661369 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661373 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661385 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661389 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661393 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661397 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661401 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661405 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:15.661532 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661409 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661413 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661417 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661421 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661425 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661429 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661433 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661437 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661442 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661446 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661450 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661453 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661457 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661462 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661474 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661478 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661483 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661487 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661491 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661494 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:15.662326 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661498 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661502 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661507 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661511 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661515 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661519 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661524 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661528 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661531 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661535 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661539 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661542 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661562 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661567 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661571 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661575 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661578 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661583 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661587 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661591 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:15.663053 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661596 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661600 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661604 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661608 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661613 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661617 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661621 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661625 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661630 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661636 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661648 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661652 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661656 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661660 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661664 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661669 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661676 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661682 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661689 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661694 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:15.663627 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661698 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661702 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661706 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.661710 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662299 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662307 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662311 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662315 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662319 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662323 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662327 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662331 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662335 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662341 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662345 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662349 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662353 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662358 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662362 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662367 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:15.664453 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662371 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662376 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662380 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662384 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662388 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662392 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662398 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662403 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662407 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662412 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662416 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662420 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662424 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662429 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662433 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662436 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662441 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662445 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662449 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662453 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:15.665136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662457 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662461 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662465 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662470 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662475 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662479 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662484 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662488 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662492 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662497 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662501 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662507 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662511 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662517 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662523 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662529 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662533 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662537 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662542 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:15.665678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662563 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662568 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662572 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662576 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662580 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662584 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662588 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662592 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662596 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662600 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662604 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662608 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662614 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662618 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662622 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662626 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662631 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662636 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662640 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662644 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:15.666268 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662648 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662652 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662655 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662660 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662665 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662669 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662673 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662677 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662682 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662686 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.662690 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663441 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663456 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663466 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663473 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663480 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663486 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663493 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663500 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663505 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663510 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:06:15.666939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663515 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663521 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663527 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663531 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663536 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663541 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663564 2576 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663569 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663573 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663581 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663585 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663590 2576 flags.go:64] FLAG: --config-dir=""
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663595 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663600 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663606 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663613 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663620 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663625 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663630 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663635 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663639 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663644 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663649 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663656 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663661 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:06:15.667440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663666 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663670 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663675 2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663680 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663687 2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663692 2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663697 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663701 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663707 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663713 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:06:15.668134 ip-10-0-141-191
kubenswrapper[2576]: I0422 19:06:15.663718 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663723 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663728 2576 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663733 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663737 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663742 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663747 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663753 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663757 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663762 2576 flags.go:64] FLAG: --feature-gates="" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663768 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663773 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663778 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663784 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663789 2576 flags.go:64] FLAG: 
--healthz-port="10248" Apr 22 19:06:15.668134 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663794 2576 flags.go:64] FLAG: --help="false" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663799 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-141-191.ec2.internal" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663804 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663809 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663813 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663819 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663824 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663829 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663834 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663839 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663843 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663848 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663853 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:06:15.668746 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:06:15.663858 2576 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663863 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663868 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663874 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663879 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663884 2576 flags.go:64] FLAG: --lock-file="" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663888 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663893 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663898 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663907 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:06:15.668746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663912 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663917 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663922 2576 flags.go:64] FLAG: --logging-format="text" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663927 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663932 2576 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663936 2576 flags.go:64] FLAG: --manifest-url="" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663941 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663948 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663954 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663960 2576 flags.go:64] FLAG: --max-pods="110" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663965 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663970 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663974 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663979 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663984 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663989 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.663994 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664005 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664010 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 
19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664015 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664020 2576 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664025 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664033 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664038 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:06:15.669291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664052 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664057 2576 flags.go:64] FLAG: --port="10250" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664062 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664067 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-043311aedd43d3fd2" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664072 2576 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664077 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664081 2576 flags.go:64] FLAG: --register-node="true" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664086 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664090 2576 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:06:15.669904 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:06:15.664097 2576 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664102 2576 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664107 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664111 2576 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664117 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664122 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664126 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664131 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664137 2576 flags.go:64] FLAG: --runonce="false" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664142 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664147 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664152 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664157 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664162 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664167 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 
19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664171 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664176 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:06:15.669904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664181 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664186 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664190 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664195 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664200 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664205 2576 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664212 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664221 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664227 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664232 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664238 2576 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664243 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664248 2576 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664252 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664257 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664262 2576 flags.go:64] FLAG: --v="2" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664269 2576 flags.go:64] FLAG: --version="false" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664276 2576 flags.go:64] FLAG: --vmodule="" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664282 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.664287 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664462 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664469 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664474 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664479 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:15.670572 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664484 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664491 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:06:15.671163 ip-10-0-141-191 
kubenswrapper[2576]: W0422 19:06:15.664495 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664501 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664506 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664510 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664514 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664518 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664522 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664526 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664530 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664534 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664538 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664542 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664566 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 
19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664570 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664574 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664578 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664582 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664587 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:06:15.671163 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664590 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664594 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664598 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664602 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664606 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664611 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664615 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664619 2576 feature_gate.go:328] 
unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664623 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664627 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664631 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664635 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664640 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664645 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664649 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664655 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664660 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664664 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664669 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664673 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:06:15.671687 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664677 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664681 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664685 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664689 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664693 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664697 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664704 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664709 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664713 2576 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664717 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664722 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664726 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664730 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664734 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664738 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664742 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664746 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664750 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664754 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:06:15.672212 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664759 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664762 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 
19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664767 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664771 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664775 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664779 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664783 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664787 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664791 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664796 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664800 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664804 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664808 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664812 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664816 2576 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager
Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664820 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664824 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664831 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664836 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:15.672698 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664844 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664848 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664853 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.664858 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.665572 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.671861 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.671879 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671923 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671928 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671932 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671935 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671938 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671940 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671943 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671946 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:15.673153 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671949 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671952 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671954 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671957 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671960 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671966 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671969 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671972 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671974 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671977 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671979 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671982 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671984 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671987 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671989 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671992 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671994 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.671997 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672000 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672003 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:15.673527 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672005 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672008 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672012 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672016 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672019 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672022 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672025 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672027 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672030 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672033 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672036 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672038 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672041 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672043 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672046 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672049 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672052 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672054 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672057 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672060 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:15.674040 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672063 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672065 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672068 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672071 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672073 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672076 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672078 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672081 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672083 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672086 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672089 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672091 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672094 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672097 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672099 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672102 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672105 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672108 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672111 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672113 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:15.674525 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672115 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672118 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672120 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672123 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672126 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672128 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672130 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672133 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672136 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672138 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672141 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672144 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672146 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672149 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672152 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672156 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672160 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:15.675028 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672163 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.672168 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672275 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672280 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672283 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672288 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672291 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672294 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672297 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672301 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672304 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672306 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672309 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672312 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672314 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:15.675507 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672317 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672320 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672322 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672325 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672327 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672330 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672333 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672335 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672337 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672340 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672342 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672345 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672348 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672351 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672354 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672356 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672359 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672361 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672364 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672367 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:15.675911 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672369 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672372 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672375 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672377 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672380 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672382 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672385 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672387 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672390 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672393 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672395 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672398 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672400 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672403 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672405 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672408 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672411 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672413 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672416 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672418 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:15.676428 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672421 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672423 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672426 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672428 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672430 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672433 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672436 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672438 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672441 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672443 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672446 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672448 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672451 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672453 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672456 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672458 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672461 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672463 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672467 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:15.676930 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672471 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672473 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672476 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672478 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672481 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672483 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672486 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672488 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672491 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672494 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672496 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672499 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672501 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:15.672504 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.672508 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:06:15.677468 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.673234 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:06:15.677856 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.675144 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:06:15.677856 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.676130 2576 server.go:1019] "Starting client certificate rotation"
Apr 22 19:06:15.677856 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.676224 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:06:15.677856 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.676265 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:06:15.699352 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.699333 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:06:15.704617 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.704591 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:06:15.711759 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.711741 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:06:15.717198 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.717184 2576 log.go:25] "Validated CRI v1 image API"
Apr 22 19:06:15.718311 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.718285 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:06:15.720710 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.720689 2576 fs.go:135] Filesystem UUIDs: map[06c6fd60-62b7-4b4e-8c4e-c701992a06cc:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fcfad8ba-66ef-4982-b6a6-cbb93a9e407e:/dev/nvme0n1p3]
Apr 22 19:06:15.720779 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.720709 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:06:15.725912 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.725894 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:06:15.726032 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.725868 2576 manager.go:217] Machine: {Timestamp:2026-04-22 19:06:15.724286303 +0000 UTC m=+0.404764152 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3079486 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2250c65f437dfe27e86960410b9ae9 SystemUUID:ec2250c6-5f43-7dfe-27e8-6960410b9ae9 BootID:9f23d25c-69f2-4a37-9319-b73c99de0467 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:01:62:1c:5d:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:01:62:1c:5d:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:16:be:a0:c5:61 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:06:15.726118 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726036 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:06:15.726167 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726140 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:06:15.726486 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726459 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:06:15.726676 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726487 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-191.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:06:15.726763 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726685 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:06:15.726763 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726698 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:06:15.726763 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.726740 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:06:15.728083 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.728072 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:06:15.728937 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.728925 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:06:15.729055 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.729044 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:06:15.732772 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.732760 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:06:15.732833 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.732784 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:06:15.732833 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.732800 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:06:15.732833 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.732813 2576 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:06:15.732833 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.732826 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 19:06:15.734398 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.734384 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:06:15.734471 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.734408 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:06:15.737589 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.737566 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:06:15.739242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.739226 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:06:15.740298 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740286 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740305 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740314 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740322 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740331 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740340 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740348 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740357 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:06:15.740366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740368 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:06:15.740635 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740377 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:06:15.740635 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740390 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:06:15.740635 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.740403 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:06:15.741319 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.741306 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:06:15.741373 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.741323 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:06:15.745015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.744894 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:06:15.745015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.744951 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q2fr2" Apr 22 19:06:15.745015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.744966 2576 server.go:1295] "Started kubelet" Apr 22 19:06:15.745495 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.745460 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:06:15.745654 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.745490 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 
Apr 22 19:06:15.745729 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.745681 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:06:15.745941 ip-10-0-141-191 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:06:15.746059 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.746018 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-191.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:06:15.746059 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.746015 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:06:15.746151 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.746086 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-191.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:06:15.747318 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.747295 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:06:15.747925 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.747910 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:06:15.751973 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.751954 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q2fr2" Apr 22 19:06:15.752966 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.752947 2576 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:06:15.753030 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.752975 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:06:15.753475 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.752575 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-191.ec2.internal.18a8c34a091347e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-191.ec2.internal,UID:ip-10-0-141-191.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-191.ec2.internal,},FirstTimestamp:2026-04-22 19:06:15.744915431 +0000 UTC m=+0.425393283,LastTimestamp:2026-04-22 19:06:15.744915431 +0000 UTC m=+0.425393283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-191.ec2.internal,}" Apr 22 19:06:15.753819 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.753798 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found" Apr 22 19:06:15.754271 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.754254 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:06:15.754346 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.754256 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:06:15.754346 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.754287 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:06:15.754417 ip-10-0-141-191 kubenswrapper[2576]: I0422 
19:06:15.754379 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:06:15.754417 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.754390 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:06:15.754927 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.754911 2576 factory.go:55] Registering systemd factory Apr 22 19:06:15.754927 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.754928 2576 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:06:15.755175 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.755150 2576 factory.go:153] Registering CRI-O factory Apr 22 19:06:15.755175 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.755171 2576 factory.go:223] Registration of the crio container factory successfully Apr 22 19:06:15.755296 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.755241 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:06:15.755296 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.755266 2576 factory.go:103] Registering Raw factory Apr 22 19:06:15.755296 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.755286 2576 manager.go:1196] Started watching for new ooms in manager Apr 22 19:06:15.755789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.755770 2576 manager.go:319] Starting recovery of all containers Apr 22 19:06:15.764301 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.764154 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:15.765095 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.765078 2576 manager.go:324] Recovery completed Apr 22 19:06:15.766495 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.766475 2576 nodelease.go:49] "Failed to get node when trying to set owner 
ref to the node lease" err="nodes \"ip-10-0-141-191.ec2.internal\" not found" node="ip-10-0-141-191.ec2.internal" Apr 22 19:06:15.770095 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.770081 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:15.772098 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.772085 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:15.772152 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.772109 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:15.772152 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.772119 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:15.772572 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.772557 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:06:15.772634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.772571 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:06:15.772634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.772589 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:06:15.774690 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.774679 2576 policy_none.go:49] "None policy: Start" Apr 22 19:06:15.774738 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.774694 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:06:15.774738 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.774704 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:06:15.809199 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809182 2576 manager.go:341] "Starting Device Plugin manager" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: E0422 
19:06:15.809217 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809230 2576 server.go:85] "Starting device plugin registration server" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809404 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809412 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809491 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809581 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.809589 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.810015 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:06:15.820602 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.810059 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-191.ec2.internal\" not found" Apr 22 19:06:15.877711 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.877669 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:06:15.878862 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.878847 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 19:06:15.878912 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.878876 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:06:15.878912 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.878896 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:06:15.878912 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.878902 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:06:15.879033 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.878935 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:06:15.881245 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.881224 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:15.910038 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.909991 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:15.910933 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.910920 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:15.911010 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.910953 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:15.911010 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.910970 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:15.911010 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.910999 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-191.ec2.internal" Apr 22 19:06:15.918943 ip-10-0-141-191 kubenswrapper[2576]: I0422 
19:06:15.918919 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-191.ec2.internal" Apr 22 19:06:15.918943 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.918950 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-191.ec2.internal\": node \"ip-10-0-141-191.ec2.internal\" not found" Apr 22 19:06:15.952190 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:15.952162 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found" Apr 22 19:06:15.979745 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.979722 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal"] Apr 22 19:06:15.979821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.979789 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:15.981423 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.981409 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:15.981500 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.981435 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:15.981500 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.981444 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:15.982643 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.982632 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:15.982764 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.982751 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" Apr 22 19:06:15.982803 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.982791 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:15.983299 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.983284 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:15.983380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.983287 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:15.983380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.983314 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:15.983380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.983328 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:15.983380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.983330 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:15.983380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.983351 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:15.984476 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.984461 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal" Apr 22 19:06:15.984562 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.984489 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:15.985108 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.985093 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:15.985161 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.985117 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:15.985161 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:15.985127 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:16.013205 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.013182 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-191.ec2.internal\" not found" node="ip-10-0-141-191.ec2.internal" Apr 22 19:06:16.017713 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.017696 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-191.ec2.internal\" not found" node="ip-10-0-141-191.ec2.internal" Apr 22 19:06:16.052916 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.052895 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found" Apr 22 19:06:16.056192 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.056176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/41b31a6aac0e1ab38c6e962369a1223f-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal\" (UID: \"41b31a6aac0e1ab38c6e962369a1223f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" Apr 22 19:06:16.056240 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.056200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41b31a6aac0e1ab38c6e962369a1223f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal\" (UID: \"41b31a6aac0e1ab38c6e962369a1223f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" Apr 22 19:06:16.056240 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.056222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/967a4594b2f65f314129e2fffff4e1e6-config\") pod \"kube-apiserver-proxy-ip-10-0-141-191.ec2.internal\" (UID: \"967a4594b2f65f314129e2fffff4e1e6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal" Apr 22 19:06:16.153480 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.153448 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found" Apr 22 19:06:16.156718 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.156705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/41b31a6aac0e1ab38c6e962369a1223f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal\" (UID: \"41b31a6aac0e1ab38c6e962369a1223f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" Apr 22 19:06:16.156764 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.156728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/41b31a6aac0e1ab38c6e962369a1223f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal\" (UID: \"41b31a6aac0e1ab38c6e962369a1223f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.156764 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.156743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/967a4594b2f65f314129e2fffff4e1e6-config\") pod \"kube-apiserver-proxy-ip-10-0-141-191.ec2.internal\" (UID: \"967a4594b2f65f314129e2fffff4e1e6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.156858 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.156804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/967a4594b2f65f314129e2fffff4e1e6-config\") pod \"kube-apiserver-proxy-ip-10-0-141-191.ec2.internal\" (UID: \"967a4594b2f65f314129e2fffff4e1e6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.156858 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.156807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/41b31a6aac0e1ab38c6e962369a1223f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal\" (UID: \"41b31a6aac0e1ab38c6e962369a1223f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.156858 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.156828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41b31a6aac0e1ab38c6e962369a1223f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal\" (UID: \"41b31a6aac0e1ab38c6e962369a1223f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.254163 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.254092 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found"
Apr 22 19:06:16.315513 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.315485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.320056 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.320037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.354789 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.354760 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found"
Apr 22 19:06:16.455235 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.455197 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found"
Apr 22 19:06:16.555691 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.555612 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found"
Apr 22 19:06:16.656232 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.656200 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found"
Apr 22 19:06:16.676494 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.676478 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:06:16.677112 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.676612 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:06:16.677112 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.676631 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:06:16.753854 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.753828 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:06:16.754301 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.754271 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:01:15 +0000 UTC" deadline="2027-12-20 12:08:53.059765495 +0000 UTC"
Apr 22 19:06:16.754346 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.754301 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14561h2m36.305466981s"
Apr 22 19:06:16.757157 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:16.757138 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-191.ec2.internal\" not found"
Apr 22 19:06:16.763416 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.763392 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:06:16.783362 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.783344 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wvq7s"
Apr 22 19:06:16.790988 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.790967 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wvq7s"
Apr 22 19:06:16.812875 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.812833 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:16.853616 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.853592 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.861906 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.861888 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:06:16.864309 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:16.864270 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967a4594b2f65f314129e2fffff4e1e6.slice/crio-260a8c877eeaf9001307a8aa56b21240391b6603a913cd2edf639693aea27373 WatchSource:0}: Error finding container 260a8c877eeaf9001307a8aa56b21240391b6603a913cd2edf639693aea27373: Status 404 returned error can't find the container with id 260a8c877eeaf9001307a8aa56b21240391b6603a913cd2edf639693aea27373
Apr 22 19:06:16.864820 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:16.864801 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b31a6aac0e1ab38c6e962369a1223f.slice/crio-db090451cc8b87903a0e5cff79ebff19e3cdc97f026f6c19dfa47ebc48d0c168 WatchSource:0}: Error finding container db090451cc8b87903a0e5cff79ebff19e3cdc97f026f6c19dfa47ebc48d0c168: Status 404 returned error can't find the container with id db090451cc8b87903a0e5cff79ebff19e3cdc97f026f6c19dfa47ebc48d0c168
Apr 22 19:06:16.865461 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.865445 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal"
Apr 22 19:06:16.868520 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.868508 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:06:16.875267 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.875253 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:06:16.881823 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.881781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" event={"ID":"41b31a6aac0e1ab38c6e962369a1223f","Type":"ContainerStarted","Data":"db090451cc8b87903a0e5cff79ebff19e3cdc97f026f6c19dfa47ebc48d0c168"}
Apr 22 19:06:16.882523 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:16.882502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal" event={"ID":"967a4594b2f65f314129e2fffff4e1e6","Type":"ContainerStarted","Data":"260a8c877eeaf9001307a8aa56b21240391b6603a913cd2edf639693aea27373"}
Apr 22 19:06:17.094724 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.093761 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:17.635673 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.635641 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:17.734133 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.734093 2576 apiserver.go:52] "Watching apiserver"
Apr 22 19:06:17.740981 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.740954 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:06:17.742167 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.742137 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7f4rv","kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262","openshift-cluster-node-tuning-operator/tuned-q9hnf","openshift-multus/multus-ktwxr","openshift-network-diagnostics/network-check-target-nfm6p","openshift-network-operator/iptables-alerter-h2chr","kube-system/konnectivity-agent-z48vt","openshift-dns/node-resolver-9gplk","openshift-image-registry/node-ca-sqb4x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal","openshift-multus/multus-additional-cni-plugins-sgjxl","openshift-multus/network-metrics-daemon-gk4zn"]
Apr 22 19:06:17.744529 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.744507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sqb4x"
Apr 22 19:06:17.745821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.745789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262"
Apr 22 19:06:17.746632 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.746609 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.746735 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.746619 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:06:17.746880 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.746863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.747004 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.746985 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8j5vf\""
Apr 22 19:06:17.747702 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.747680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:06:17.747826 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.747793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.747923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.747860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.747923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.747912 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.748066 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.748048 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5zjtj\""
Apr 22 19:06:17.749207 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.749189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:17.749317 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.749273 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:17.749855 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.749782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.749986 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.749861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:06:17.750671 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.750091 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:06:17.750671 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.750162 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jp9nb\""
Apr 22 19:06:17.750671 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.750091 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.750671 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.750363 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h2chr"
Apr 22 19:06:17.751724 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.751628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9gplk"
Apr 22 19:06:17.752633 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.752437 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ddjjq\""
Apr 22 19:06:17.752633 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.752443 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.752633 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.752526 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.752816 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.752791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.753629 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.753606 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:06:17.753738 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.753677 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-546cr\""
Apr 22 19:06:17.753738 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.753729 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.754965 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.754945 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.755256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.755316 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.755350 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.755844 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zh7lz\""
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.756031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.756218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:06:17.756662 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.756608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:17.757087 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.756666 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:06:17.757087 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.756687 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:17.757087 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.756760 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:06:17.757087 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.756966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z48vt"
Apr 22 19:06:17.757703 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.757682 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:06:17.757936 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.757920 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9rstv\""
Apr 22 19:06:17.758811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.758789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sgjxl"
Apr 22 19:06:17.759326 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.759306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:06:17.759409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.759389 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:06:17.760291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.760274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nd968\""
Apr 22 19:06:17.760387 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.760293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:06:17.760850 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.760835 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qwmh9\""
Apr 22 19:06:17.760850 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.760842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:06:17.761144 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.761121 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:06:17.764952 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.764934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb917968-4a52-4305-8b42-7cfc0d5bf83c-tmp-dir\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk"
Apr 22 19:06:17.765042 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.764981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-system-cni-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765042 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/677eb918-a02c-47e7-853e-5d091e94e4e3-konnectivity-ca\") pod \"konnectivity-agent-z48vt\" (UID: \"677eb918-a02c-47e7-853e-5d091e94e4e3\") " pod="kube-system/konnectivity-agent-z48vt"
Apr 22 19:06:17.765159 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-ovn\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765159 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovnkube-script-lib\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765249 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-etc-selinux\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262"
Apr 22 19:06:17.765249 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/677eb918-a02c-47e7-853e-5d091e94e4e3-agent-certs\") pod \"konnectivity-agent-z48vt\" (UID: \"677eb918-a02c-47e7-853e-5d091e94e4e3\") " pod="kube-system/konnectivity-agent-z48vt"
Apr 22 19:06:17.765345 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-var-lib-kubelet\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.765345 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-tuned\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.765345 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-cni-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765345 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-k8s-cni-cncf-io\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-multus-certs\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d44m\" (UniqueName: \"kubernetes.io/projected/80efecb9-2d19-4b94-9cd9-9207adab24a3-kube-api-access-7d44m\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-log-socket\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysctl-conf\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-cni-bin\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-conf-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb917968-4a52-4305-8b42-7cfc0d5bf83c-hosts-file\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-run-netns\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-env-overrides\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-modprobe-d\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-daemon-config\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/80efecb9-2d19-4b94-9cd9-9207adab24a3-iptables-alerter-script\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jtv\" (UniqueName: \"kubernetes.io/projected/192f282f-5991-460f-a756-3f4a540d048a-kube-api-access-k2jtv\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-kubelet\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-etc-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-tmp\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-kubelet\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-systemd\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-var-lib-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-cni-netd\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.765938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7zk\" (UniqueName: \"kubernetes.io/projected/eb917968-4a52-4305-8b42-7cfc0d5bf83c-kube-api-access-xm7zk\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-systemd-units\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0f709c0-fc28-4eab-9cf8-603681f7f300-host\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b12af7b-06a9-4788-95b8-dc94a26738fe-cni-binary-copy\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.765997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-kubernetes\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tttn7\" (UniqueName: \"kubernetes.io/projected/65c2e60e-a686-4be8-bb8d-33be235b8b32-kube-api-access-tttn7\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysconfig\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvtm\" (UniqueName: \"kubernetes.io/projected/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-kube-api-access-blvtm\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf"
Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766077 2576
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysctl-d\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-hostroot\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkflh\" (UniqueName: \"kubernetes.io/projected/8b12af7b-06a9-4788-95b8-dc94a26738fe-kube-api-access-dkflh\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80efecb9-2d19-4b94-9cd9-9207adab24a3-host-slash\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-socket-dir-parent\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 
19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-cni-multus\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-lib-modules\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.766674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0f709c0-fc28-4eab-9cf8-603681f7f300-serviceca\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-sys-fs\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-cnibin\") pod \"multus-ktwxr\" (UID: 
\"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-cni-bin\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-systemd\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh7zs\" (UniqueName: \"kubernetes.io/projected/e0f709c0-fc28-4eab-9cf8-603681f7f300-kube-api-access-mh7zs\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-os-release\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-node-log\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99bl\" (UniqueName: \"kubernetes.io/projected/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-kube-api-access-k99bl\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-sys\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-host\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766495 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-registration-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-netns\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766642 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-etc-kubernetes\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovnkube-config\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.767395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovn-node-metrics-cert\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.768269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-socket-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.768269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-device-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.768269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-slash\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.768269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.766811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-run\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.791697 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.791670 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:01:16 +0000 UTC" deadline="2027-09-28 21:11:57.61779059 +0000 UTC" Apr 22 19:06:17.791815 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.791698 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12578h5m39.826095635s" Apr 22 19:06:17.855593 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.855564 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:06:17.867236 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-host\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.867334 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-registration-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.867334 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cnibin\") pod 
\"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.867334 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-netns\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-host\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-etc-kubernetes\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-registration-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovnkube-config\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-netns\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-etc-kubernetes\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.867493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovn-node-metrics-cert\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-socket-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-device-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-slash\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-run\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-device-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-slash\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-socket-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cni-binary-copy\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-run\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867754 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb917968-4a52-4305-8b42-7cfc0d5bf83c-tmp-dir\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-system-cni-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867794 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/677eb918-a02c-47e7-853e-5d091e94e4e3-konnectivity-ca\") pod \"konnectivity-agent-z48vt\" (UID: \"677eb918-a02c-47e7-853e-5d091e94e4e3\") " pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:17.867941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-system-cni-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-ovn\") pod 
\"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.867985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovnkube-script-lib\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-etc-selinux\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-ovn\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovnkube-config\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb917968-4a52-4305-8b42-7cfc0d5bf83c-tmp-dir\") pod 
\"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/677eb918-a02c-47e7-853e-5d091e94e4e3-agent-certs\") pod \"konnectivity-agent-z48vt\" (UID: \"677eb918-a02c-47e7-853e-5d091e94e4e3\") " pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-var-lib-kubelet\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-etc-selinux\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-tuned\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-cni-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-var-lib-kubelet\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-k8s-cni-cncf-io\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-multus-certs\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.868634 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-cni-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d44m\" (UniqueName: \"kubernetes.io/projected/80efecb9-2d19-4b94-9cd9-9207adab24a3-kube-api-access-7d44m\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-k8s-cni-cncf-io\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-log-socket\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-run-multus-certs\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysctl-conf\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-cni-bin\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868450 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/677eb918-a02c-47e7-853e-5d091e94e4e3-konnectivity-ca\") pod \"konnectivity-agent-z48vt\" (UID: \"677eb918-a02c-47e7-853e-5d091e94e4e3\") " pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-conf-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-conf-dir\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb917968-4a52-4305-8b42-7cfc0d5bf83c-hosts-file\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovnkube-script-lib\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-cni-bin\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-run-netns\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-env-overrides\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-run-netns\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.869374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-modprobe-d\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb917968-4a52-4305-8b42-7cfc0d5bf83c-hosts-file\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-daemon-config\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/80efecb9-2d19-4b94-9cd9-9207adab24a3-iptables-alerter-script\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysctl-conf\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868749 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k2jtv\" (UniqueName: \"kubernetes.io/projected/192f282f-5991-460f-a756-3f4a540d048a-kube-api-access-k2jtv\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-kubelet\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-log-socket\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-etc-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-etc-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 
19:06:17.868874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-tmp\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-kubelet\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-systemd\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 
19:06:17.868971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-var-lib-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-cni-netd\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.868996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-modprobe-d\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7zk\" (UniqueName: \"kubernetes.io/projected/eb917968-4a52-4305-8b42-7cfc0d5bf83c-kube-api-access-xm7zk\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-systemd-units\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:06:17.869061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65c2e60e-a686-4be8-bb8d-33be235b8b32-env-overrides\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0f709c0-fc28-4eab-9cf8-603681f7f300-host\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b12af7b-06a9-4788-95b8-dc94a26738fe-cni-binary-copy\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-kubernetes\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-system-cni-dir\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.870865 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-kubelet\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tttn7\" (UniqueName: \"kubernetes.io/projected/65c2e60e-a686-4be8-bb8d-33be235b8b32-kube-api-access-tttn7\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysconfig\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blvtm\" (UniqueName: \"kubernetes.io/projected/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-kube-api-access-blvtm\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqsbx\" (UniqueName: \"kubernetes.io/projected/0d1ae74c-067a-489d-a7eb-707cf1b181a7-kube-api-access-sqsbx\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysctl-d\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.870865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-hostroot\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkflh\" (UniqueName: \"kubernetes.io/projected/8b12af7b-06a9-4788-95b8-dc94a26738fe-kube-api-access-dkflh\") pod 
\"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80efecb9-2d19-4b94-9cd9-9207adab24a3-host-slash\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-daemon-config\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-socket-dir-parent\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-cni-multus\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/80efecb9-2d19-4b94-9cd9-9207adab24a3-iptables-alerter-script\") pod 
\"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-lib-modules\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0f709c0-fc28-4eab-9cf8-603681f7f300-serviceca\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-sys-fs\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-cnibin\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-cni-bin\") pod \"ovnkube-node-7f4rv\" (UID: 
\"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-hostroot\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-systemd\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh7zs\" (UniqueName: \"kubernetes.io/projected/e0f709c0-fc28-4eab-9cf8-603681f7f300-kube-api-access-mh7zs\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-os-release\") pod 
\"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-kubelet\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.871682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-os-release\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-node-log\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k99bl\" (UniqueName: \"kubernetes.io/projected/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-kube-api-access-k99bl\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.870976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-sys\") pod 
\"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.871078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-sys\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.871133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-node-log\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.871565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-tuned\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.871682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-systemd-units\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.871767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-var-lib-openvswitch\") pod \"ovnkube-node-7f4rv\" (UID: 
\"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.872320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.871862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-cni-netd\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.873117 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.873094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b12af7b-06a9-4788-95b8-dc94a26738fe-cni-binary-copy\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.873232 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.873196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0f709c0-fc28-4eab-9cf8-603681f7f300-host\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.873761 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.873740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysconfig\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.873863 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.869123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-run-systemd\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.874147 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.874129 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:17.874213 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.874154 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:17.874213 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.874169 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7vqxc for pod openshift-network-diagnostics/network-check-target-nfm6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.874218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-multus-socket-dir-parent\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.874296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-host-var-lib-cni-multus\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.874355 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc 
podName:de257b4d-6fda-4abf-96a2-b516950ed9ef nodeName:}" failed. No retries permitted until 2026-04-22 19:06:18.374308171 +0000 UTC m=+3.054786024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7vqxc" (UniqueName: "kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc") pod "network-check-target-nfm6p" (UID: "de257b4d-6fda-4abf-96a2-b516950ed9ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.874386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-lib-modules\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.874839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0f709c0-fc28-4eab-9cf8-603681f7f300-serviceca\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.874920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/192f282f-5991-460f-a756-3f4a540d048a-sys-fs\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.874972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-kubernetes\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.875066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-sysctl-d\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.875145 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:17.875329 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:18.375298744 +0000 UTC m=+3.055776581 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.875419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-cnibin\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.875472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-etc-systemd\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.875668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-cni-bin\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.876036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/677eb918-a02c-47e7-853e-5d091e94e4e3-agent-certs\") pod \"konnectivity-agent-z48vt\" (UID: \"677eb918-a02c-47e7-853e-5d091e94e4e3\") " pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.876197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/80efecb9-2d19-4b94-9cd9-9207adab24a3-host-slash\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.877573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.876368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65c2e60e-a686-4be8-bb8d-33be235b8b32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.878380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.876484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b12af7b-06a9-4788-95b8-dc94a26738fe-os-release\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.878380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.877817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65c2e60e-a686-4be8-bb8d-33be235b8b32-ovn-node-metrics-cert\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.879045 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.878479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d44m\" (UniqueName: \"kubernetes.io/projected/80efecb9-2d19-4b94-9cd9-9207adab24a3-kube-api-access-7d44m\") pod \"iptables-alerter-h2chr\" (UID: \"80efecb9-2d19-4b94-9cd9-9207adab24a3\") " pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:17.879816 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.879380 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k2jtv\" (UniqueName: \"kubernetes.io/projected/192f282f-5991-460f-a756-3f4a540d048a-kube-api-access-k2jtv\") pod \"aws-ebs-csi-driver-node-67262\" (UID: \"192f282f-5991-460f-a756-3f4a540d048a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:17.880329 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.880302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-tmp\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.882871 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.882821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7zk\" (UniqueName: \"kubernetes.io/projected/eb917968-4a52-4305-8b42-7cfc0d5bf83c-kube-api-access-xm7zk\") pod \"node-resolver-9gplk\" (UID: \"eb917968-4a52-4305-8b42-7cfc0d5bf83c\") " pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:17.883082 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.883059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvtm\" (UniqueName: \"kubernetes.io/projected/59064f5c-f3de-4cf6-b0ff-2f4ffea972d8-kube-api-access-blvtm\") pod \"tuned-q9hnf\" (UID: \"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8\") " pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:17.883570 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.883516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k99bl\" (UniqueName: \"kubernetes.io/projected/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-kube-api-access-k99bl\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:17.884642 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.884623 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tttn7\" (UniqueName: \"kubernetes.io/projected/65c2e60e-a686-4be8-bb8d-33be235b8b32-kube-api-access-tttn7\") pod \"ovnkube-node-7f4rv\" (UID: \"65c2e60e-a686-4be8-bb8d-33be235b8b32\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:17.884878 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.884859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkflh\" (UniqueName: \"kubernetes.io/projected/8b12af7b-06a9-4788-95b8-dc94a26738fe-kube-api-access-dkflh\") pod \"multus-ktwxr\" (UID: \"8b12af7b-06a9-4788-95b8-dc94a26738fe\") " pod="openshift-multus/multus-ktwxr" Apr 22 19:06:17.885160 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.885138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh7zs\" (UniqueName: \"kubernetes.io/projected/e0f709c0-fc28-4eab-9cf8-603681f7f300-kube-api-access-mh7zs\") pod \"node-ca-sqb4x\" (UID: \"e0f709c0-fc28-4eab-9cf8-603681f7f300\") " pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:17.949454 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.949383 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:17.971720 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-os-release\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971720 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cnibin\") pod 
\"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cni-binary-copy\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971794 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cnibin\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-os-release\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-system-cni-dir\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.971935 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.972223 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqsbx\" (UniqueName: \"kubernetes.io/projected/0d1ae74c-067a-489d-a7eb-707cf1b181a7-kube-api-access-sqsbx\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " 
pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.972223 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.971996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d1ae74c-067a-489d-a7eb-707cf1b181a7-system-cni-dir\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.972295 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.972274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.972388 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.972369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cni-binary-copy\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:17.972431 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:17.972366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0d1ae74c-067a-489d-a7eb-707cf1b181a7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:18.005517 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.005494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqsbx\" (UniqueName: 
\"kubernetes.io/projected/0d1ae74c-067a-489d-a7eb-707cf1b181a7-kube-api-access-sqsbx\") pod \"multus-additional-cni-plugins-sgjxl\" (UID: \"0d1ae74c-067a-489d-a7eb-707cf1b181a7\") " pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:18.059484 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.059463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sqb4x" Apr 22 19:06:18.070131 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.070111 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" Apr 22 19:06:18.077864 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.077849 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ktwxr" Apr 22 19:06:18.083165 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.083149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h2chr" Apr 22 19:06:18.088680 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.088662 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9gplk" Apr 22 19:06:18.096278 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.096261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:18.100819 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.100801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" Apr 22 19:06:18.107304 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.107289 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:18.113780 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.113764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" Apr 22 19:06:18.375064 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.374987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:18.375216 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:18.375140 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:18.375216 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:18.375161 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:18.375216 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:18.375172 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7vqxc for pod openshift-network-diagnostics/network-check-target-nfm6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:18.375363 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:18.375237 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc podName:de257b4d-6fda-4abf-96a2-b516950ed9ef nodeName:}" failed. 
No retries permitted until 2026-04-22 19:06:19.375217722 +0000 UTC m=+4.055695574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7vqxc" (UniqueName: "kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc") pod "network-check-target-nfm6p" (UID: "de257b4d-6fda-4abf-96a2-b516950ed9ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:18.475915 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.475889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:18.476014 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:18.476002 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:18.476072 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:18.476063 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:19.476049753 +0000 UTC m=+4.156527590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:18.490822 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.490797 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677eb918_a02c_47e7_853e_5d091e94e4e3.slice/crio-388c5387a8dbd7e32a9470178ab8fc2c704fad90e5b891f9abf298ec524d5166 WatchSource:0}: Error finding container 388c5387a8dbd7e32a9470178ab8fc2c704fad90e5b891f9abf298ec524d5166: Status 404 returned error can't find the container with id 388c5387a8dbd7e32a9470178ab8fc2c704fad90e5b891f9abf298ec524d5166 Apr 22 19:06:18.491962 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.491881 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80efecb9_2d19_4b94_9cd9_9207adab24a3.slice/crio-2a30a32bae3aff1c778e4dd882e92039d2873b0536285e0e9c3cb4ffe0106b1a WatchSource:0}: Error finding container 2a30a32bae3aff1c778e4dd882e92039d2873b0536285e0e9c3cb4ffe0106b1a: Status 404 returned error can't find the container with id 2a30a32bae3aff1c778e4dd882e92039d2873b0536285e0e9c3cb4ffe0106b1a Apr 22 19:06:18.492927 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.492763 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f709c0_fc28_4eab_9cf8_603681f7f300.slice/crio-b853fbe0cdc1f796faee396e52a6dde25322f1058548a177d558f001e35befb3 WatchSource:0}: Error finding container b853fbe0cdc1f796faee396e52a6dde25322f1058548a177d558f001e35befb3: Status 404 returned error can't find the container with id b853fbe0cdc1f796faee396e52a6dde25322f1058548a177d558f001e35befb3 Apr 22 19:06:18.493290 
ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.493269 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59064f5c_f3de_4cf6_b0ff_2f4ffea972d8.slice/crio-f3f6b549e59436911a6276f3599d01240a479302e50f31219a49d7b6eca7fa3b WatchSource:0}: Error finding container f3f6b549e59436911a6276f3599d01240a479302e50f31219a49d7b6eca7fa3b: Status 404 returned error can't find the container with id f3f6b549e59436911a6276f3599d01240a479302e50f31219a49d7b6eca7fa3b Apr 22 19:06:18.494856 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.494824 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c2e60e_a686_4be8_bb8d_33be235b8b32.slice/crio-148121326eb2a946b6501d327dcd203823d5253b094489838daca5225fadb791 WatchSource:0}: Error finding container 148121326eb2a946b6501d327dcd203823d5253b094489838daca5225fadb791: Status 404 returned error can't find the container with id 148121326eb2a946b6501d327dcd203823d5253b094489838daca5225fadb791 Apr 22 19:06:18.498504 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.498466 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb917968_4a52_4305_8b42_7cfc0d5bf83c.slice/crio-a3095076930095b5a3d7070a9fc8e2732cdcf4cf41b94ffe4522e948e819df0d WatchSource:0}: Error finding container a3095076930095b5a3d7070a9fc8e2732cdcf4cf41b94ffe4522e948e819df0d: Status 404 returned error can't find the container with id a3095076930095b5a3d7070a9fc8e2732cdcf4cf41b94ffe4522e948e819df0d Apr 22 19:06:18.499488 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.499465 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1ae74c_067a_489d_a7eb_707cf1b181a7.slice/crio-6e8ea51ed1ea5599a993c29cb6c2933a84e3c31a862d38810cf020742b3e99f8 WatchSource:0}: 
Error finding container 6e8ea51ed1ea5599a993c29cb6c2933a84e3c31a862d38810cf020742b3e99f8: Status 404 returned error can't find the container with id 6e8ea51ed1ea5599a993c29cb6c2933a84e3c31a862d38810cf020742b3e99f8
Apr 22 19:06:18.500063 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.499880 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod192f282f_5991_460f_a756_3f4a540d048a.slice/crio-8d7cdca274dca87c152e355d52376498921ab93cefb0b3346a90380f1bab3eea WatchSource:0}: Error finding container 8d7cdca274dca87c152e355d52376498921ab93cefb0b3346a90380f1bab3eea: Status 404 returned error can't find the container with id 8d7cdca274dca87c152e355d52376498921ab93cefb0b3346a90380f1bab3eea
Apr 22 19:06:18.501088 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:18.500974 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b12af7b_06a9_4788_95b8_dc94a26738fe.slice/crio-8b9b21d7ee9c5bf9fceccc98ea51930fb79eeb996ee9db00f4e7c2bf6f289d9d WatchSource:0}: Error finding container 8b9b21d7ee9c5bf9fceccc98ea51930fb79eeb996ee9db00f4e7c2bf6f289d9d: Status 404 returned error can't find the container with id 8b9b21d7ee9c5bf9fceccc98ea51930fb79eeb996ee9db00f4e7c2bf6f289d9d
Apr 22 19:06:18.792082 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.791854 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:01:16 +0000 UTC" deadline="2028-01-14 09:18:35.430148478 +0000 UTC"
Apr 22 19:06:18.792082 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.792021 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15158h12m16.638130075s"
Apr 22 19:06:18.888122 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.888086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal" event={"ID":"967a4594b2f65f314129e2fffff4e1e6","Type":"ContainerStarted","Data":"3a0fbb416b803d2de39e2f61dad7f2db612bea35d826891c52a5dec2b6acb592"}
Apr 22 19:06:18.890300 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.890150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" event={"ID":"192f282f-5991-460f-a756-3f4a540d048a","Type":"ContainerStarted","Data":"8d7cdca274dca87c152e355d52376498921ab93cefb0b3346a90380f1bab3eea"}
Apr 22 19:06:18.893517 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.893481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerStarted","Data":"6e8ea51ed1ea5599a993c29cb6c2933a84e3c31a862d38810cf020742b3e99f8"}
Apr 22 19:06:18.894892 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.894862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqb4x" event={"ID":"e0f709c0-fc28-4eab-9cf8-603681f7f300","Type":"ContainerStarted","Data":"b853fbe0cdc1f796faee396e52a6dde25322f1058548a177d558f001e35befb3"}
Apr 22 19:06:18.896456 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.896427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" event={"ID":"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8","Type":"ContainerStarted","Data":"f3f6b549e59436911a6276f3599d01240a479302e50f31219a49d7b6eca7fa3b"}
Apr 22 19:06:18.898838 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.898716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z48vt" event={"ID":"677eb918-a02c-47e7-853e-5d091e94e4e3","Type":"ContainerStarted","Data":"388c5387a8dbd7e32a9470178ab8fc2c704fad90e5b891f9abf298ec524d5166"}
Apr 22 19:06:18.899792 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.899762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ktwxr" event={"ID":"8b12af7b-06a9-4788-95b8-dc94a26738fe","Type":"ContainerStarted","Data":"8b9b21d7ee9c5bf9fceccc98ea51930fb79eeb996ee9db00f4e7c2bf6f289d9d"}
Apr 22 19:06:18.902310 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.902279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9gplk" event={"ID":"eb917968-4a52-4305-8b42-7cfc0d5bf83c","Type":"ContainerStarted","Data":"a3095076930095b5a3d7070a9fc8e2732cdcf4cf41b94ffe4522e948e819df0d"}
Apr 22 19:06:18.904141 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.904110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"148121326eb2a946b6501d327dcd203823d5253b094489838daca5225fadb791"}
Apr 22 19:06:18.905445 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:18.905423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h2chr" event={"ID":"80efecb9-2d19-4b94-9cd9-9207adab24a3","Type":"ContainerStarted","Data":"2a30a32bae3aff1c778e4dd882e92039d2873b0536285e0e9c3cb4ffe0106b1a"}
Apr 22 19:06:19.384810 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.384181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:19.384810 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.384328 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:19.384810 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.384344 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:19.384810 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.384356 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7vqxc for pod openshift-network-diagnostics/network-check-target-nfm6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:19.384810 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.384414 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc podName:de257b4d-6fda-4abf-96a2-b516950ed9ef nodeName:}" failed. No retries permitted until 2026-04-22 19:06:21.384396642 +0000 UTC m=+6.064874481 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7vqxc" (UniqueName: "kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc") pod "network-check-target-nfm6p" (UID: "de257b4d-6fda-4abf-96a2-b516950ed9ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:19.485484 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.484883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:19.485484 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.485052 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:19.485484 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.485115 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:21.485096107 +0000 UTC m=+6.165573948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:19.880179 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.879697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:19.880179 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.879827 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:19.883248 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.880850 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:19.883248 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:19.880948 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:19.939095 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.937957 2576 generic.go:358] "Generic (PLEG): container finished" podID="41b31a6aac0e1ab38c6e962369a1223f" containerID="decb74f146d7536ce6b52540b080a5dfd9cbf79dc7ca2d64bfa97c4dc1799f75" exitCode=0
Apr 22 19:06:19.939095 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.938817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" event={"ID":"41b31a6aac0e1ab38c6e962369a1223f","Type":"ContainerDied","Data":"decb74f146d7536ce6b52540b080a5dfd9cbf79dc7ca2d64bfa97c4dc1799f75"}
Apr 22 19:06:19.958529 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:19.958106 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-191.ec2.internal" podStartSLOduration=3.958088806 podStartE2EDuration="3.958088806s" podCreationTimestamp="2026-04-22 19:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:18.903772729 +0000 UTC m=+3.584250585" watchObservedRunningTime="2026-04-22 19:06:19.958088806 +0000 UTC m=+4.638566663"
Apr 22 19:06:20.950475 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:20.949739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" event={"ID":"41b31a6aac0e1ab38c6e962369a1223f","Type":"ContainerStarted","Data":"a57d1adbf3d94b43d5fcc77c5e20b71fd95d1b081be4e74daea0993bb5bd2753"}
Apr 22 19:06:20.966439 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:20.966283 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-191.ec2.internal" podStartSLOduration=4.966263426 podStartE2EDuration="4.966263426s" podCreationTimestamp="2026-04-22 19:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:20.965777163 +0000 UTC m=+5.646255019" watchObservedRunningTime="2026-04-22 19:06:20.966263426 +0000 UTC m=+5.646741282"
Apr 22 19:06:21.405844 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:21.405761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:21.406006 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.405928 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:21.406006 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.405947 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:21.406006 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.405958 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7vqxc for pod openshift-network-diagnostics/network-check-target-nfm6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:21.406177 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.406018 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc podName:de257b4d-6fda-4abf-96a2-b516950ed9ef nodeName:}" failed. No retries permitted until 2026-04-22 19:06:25.406000146 +0000 UTC m=+10.086477984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7vqxc" (UniqueName: "kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc") pod "network-check-target-nfm6p" (UID: "de257b4d-6fda-4abf-96a2-b516950ed9ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:21.506283 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:21.506214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:21.506454 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.506391 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:21.506523 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.506460 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:25.506441204 +0000 UTC m=+10.186919043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:21.880426 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:21.879905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:21.880426 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:21.880051 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:21.880426 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.880128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:21.880426 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:21.880045 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:23.879253 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:23.879225 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:23.879783 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:23.879351 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:23.879783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:23.879440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:23.879783 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:23.879575 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:25.434401 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:25.434366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:25.434879 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.434523 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:25.434879 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.434561 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:25.434879 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.434578 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7vqxc for pod openshift-network-diagnostics/network-check-target-nfm6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:25.434879 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.434635 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc podName:de257b4d-6fda-4abf-96a2-b516950ed9ef nodeName:}" failed. No retries permitted until 2026-04-22 19:06:33.434616119 +0000 UTC m=+18.115093954 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7vqxc" (UniqueName: "kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc") pod "network-check-target-nfm6p" (UID: "de257b4d-6fda-4abf-96a2-b516950ed9ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:25.535420 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:25.535387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:25.535620 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.535595 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:25.535770 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.535672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:33.535649885 +0000 UTC m=+18.216127725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:25.880343 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:25.880265 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:25.880497 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.880371 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:25.880778 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:25.880749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:25.880876 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:25.880861 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:27.879636 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:27.879536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:27.880064 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:27.879686 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:27.880064 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:27.879729 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:27.880064 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:27.879807 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:29.879194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:29.879153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:29.879661 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:29.879286 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:29.879661 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:29.879330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:29.879661 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:29.879453 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:31.882044 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:31.882018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:31.882524 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:31.882018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:31.882524 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:31.882138 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:31.882524 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:31.882219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:32.449300 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.449265 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tksss"]
Apr 22 19:06:32.458252 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.458224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.458368 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:32.458309 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109"
Apr 22 19:06:32.587142 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.587112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.587306 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.587153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75bd4507-59f4-478b-8153-398fc3f4f109-kubelet-config\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.587306 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.587206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75bd4507-59f4-478b-8153-398fc3f4f109-dbus\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.687886 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.687851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.688038 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.687897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75bd4507-59f4-478b-8153-398fc3f4f109-kubelet-config\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.688038 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.687946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75bd4507-59f4-478b-8153-398fc3f4f109-dbus\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.688038 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:32.688005 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:32.688201 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.688052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75bd4507-59f4-478b-8153-398fc3f4f109-kubelet-config\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:32.688201 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:32.688081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret podName:75bd4507-59f4-478b-8153-398fc3f4f109 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:33.188061136 +0000 UTC m=+17.868538969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret") pod "global-pull-secret-syncer-tksss" (UID: "75bd4507-59f4-478b-8153-398fc3f4f109") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:32.688201 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:32.688097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75bd4507-59f4-478b-8153-398fc3f4f109-dbus\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:33.192656 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:33.192619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:33.193092 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.192774 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:33.193092 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.192850 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret podName:75bd4507-59f4-478b-8153-398fc3f4f109 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:34.192829123 +0000 UTC m=+18.873306962 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret") pod "global-pull-secret-syncer-tksss" (UID: "75bd4507-59f4-478b-8153-398fc3f4f109") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:33.495203 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:33.495111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:33.495355 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.495283 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:33.495355 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.495307 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:33.495355 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.495319 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7vqxc for pod openshift-network-diagnostics/network-check-target-nfm6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:33.495514 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.495377 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc podName:de257b4d-6fda-4abf-96a2-b516950ed9ef nodeName:}" failed. No retries permitted until 2026-04-22 19:06:49.495362621 +0000 UTC m=+34.175840454 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7vqxc" (UniqueName: "kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc") pod "network-check-target-nfm6p" (UID: "de257b4d-6fda-4abf-96a2-b516950ed9ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:33.596322 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:33.596281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:33.596479 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.596453 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:33.596535 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.596525 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:49.596508265 +0000 UTC m=+34.276986097 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:33.879677 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:33.879599 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:06:33.879677 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:33.879622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:06:33.879882 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:33.879599 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:33.879882 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.879722 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:06:33.879968 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.879883 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109"
Apr 22 19:06:33.879968 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:33.879957 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef"
Apr 22 19:06:34.201957 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:34.201875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss"
Apr 22 19:06:34.202309 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:34.202037 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:34.202309 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:34.202108 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret podName:75bd4507-59f4-478b-8153-398fc3f4f109 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:36.202091432 +0000 UTC m=+20.882569266 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret") pod "global-pull-secret-syncer-tksss" (UID: "75bd4507-59f4-478b-8153-398fc3f4f109") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:35.882438 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.882307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:35.883973 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.882314 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:35.883973 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.882376 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:35.883973 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:35.882590 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:35.883973 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:35.882693 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:35.883973 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:35.882779 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:35.972370 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.972108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" event={"ID":"192f282f-5991-460f-a756-3f4a540d048a","Type":"ContainerStarted","Data":"2527ef32a6a85a6b668ee15683211424c994ba3f808d6d40cb73845f72343b3c"} Apr 22 19:06:35.973457 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.973424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerStarted","Data":"0bf99e18a262b21d72c9d2b4c9b6fc18daf9b97b311900ed5854f4da825eb318"} Apr 22 19:06:35.975095 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.975066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqb4x" event={"ID":"e0f709c0-fc28-4eab-9cf8-603681f7f300","Type":"ContainerStarted","Data":"146c8032b53778a76f9ebb0b713be7039acaa99ab131bf6fe0eecc1f3b231c45"} Apr 22 19:06:35.976672 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.976645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" event={"ID":"59064f5c-f3de-4cf6-b0ff-2f4ffea972d8","Type":"ContainerStarted","Data":"da3af04a67756947e4e281b06020e7e6107e4b088173998f92f5be79bb9d328e"} Apr 22 19:06:35.978158 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.978115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z48vt" event={"ID":"677eb918-a02c-47e7-853e-5d091e94e4e3","Type":"ContainerStarted","Data":"dd073885cc7aeb150e4abe70da5d4abf75536c9e225d8ac695f01572e1bffdab"} Apr 22 19:06:35.982253 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.982230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ktwxr" event={"ID":"8b12af7b-06a9-4788-95b8-dc94a26738fe","Type":"ContainerStarted","Data":"c2ecba30da21d39918b1a4176af9aa04ec0a106bf1c8b98283b316872243c2ff"} Apr 22 19:06:35.983898 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.983864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9gplk" event={"ID":"eb917968-4a52-4305-8b42-7cfc0d5bf83c","Type":"ContainerStarted","Data":"bab01a2890831dd620a35a723fff9a3c9def48b756ea283ce0a42ce52618dac8"} Apr 22 19:06:35.986707 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.986687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"b7641006dec9a6f8362995e60a7aa2662d76e4985bd53a52fff0429095f1f224"} Apr 22 19:06:35.986789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.986712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"836e3114170169bf9f563cf5697ecb5423049407009e11849b052ff5a2be5a34"} Apr 22 19:06:35.986789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.986726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"116dbe3c2a7c2e084a2f8bed62a745f0705c4a0625f9c471b2e604be9fe4b667"} Apr 22 
19:06:35.986789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:35.986738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"45e264904619d5378b8c2c464304f0963527be466848cde4669eaeef9100d4f9"} Apr 22 19:06:36.011250 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.011208 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sqb4x" podStartSLOduration=4.083164012 podStartE2EDuration="21.011194106s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.495280586 +0000 UTC m=+3.175758426" lastFinishedPulling="2026-04-22 19:06:35.423310673 +0000 UTC m=+20.103788520" observedRunningTime="2026-04-22 19:06:36.010662958 +0000 UTC m=+20.691140813" watchObservedRunningTime="2026-04-22 19:06:36.011194106 +0000 UTC m=+20.691671960" Apr 22 19:06:36.027108 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.027065 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ktwxr" podStartSLOduration=3.85823643 podStartE2EDuration="21.027053328s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.503293151 +0000 UTC m=+3.183770987" lastFinishedPulling="2026-04-22 19:06:35.672110045 +0000 UTC m=+20.352587885" observedRunningTime="2026-04-22 19:06:36.027033061 +0000 UTC m=+20.707510919" watchObservedRunningTime="2026-04-22 19:06:36.027053328 +0000 UTC m=+20.707531161" Apr 22 19:06:36.043699 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.043658 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-q9hnf" podStartSLOduration=3.116387176 podStartE2EDuration="20.043646492s" podCreationTimestamp="2026-04-22 19:06:16 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.495648948 +0000 UTC 
m=+3.176126785" lastFinishedPulling="2026-04-22 19:06:35.422908259 +0000 UTC m=+20.103386101" observedRunningTime="2026-04-22 19:06:36.043201874 +0000 UTC m=+20.723679728" watchObservedRunningTime="2026-04-22 19:06:36.043646492 +0000 UTC m=+20.724124347" Apr 22 19:06:36.073893 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.073854 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z48vt" podStartSLOduration=3.14325059 podStartE2EDuration="20.073832003s" podCreationTimestamp="2026-04-22 19:06:16 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.492330563 +0000 UTC m=+3.172808397" lastFinishedPulling="2026-04-22 19:06:35.422911966 +0000 UTC m=+20.103389810" observedRunningTime="2026-04-22 19:06:36.073742573 +0000 UTC m=+20.754220431" watchObservedRunningTime="2026-04-22 19:06:36.073832003 +0000 UTC m=+20.754309858" Apr 22 19:06:36.074009 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.073919 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9gplk" podStartSLOduration=4.145934688 podStartE2EDuration="21.073914886s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.500232278 +0000 UTC m=+3.180710124" lastFinishedPulling="2026-04-22 19:06:35.428212476 +0000 UTC m=+20.108690322" observedRunningTime="2026-04-22 19:06:36.059755122 +0000 UTC m=+20.740232977" watchObservedRunningTime="2026-04-22 19:06:36.073914886 +0000 UTC m=+20.754392741" Apr 22 19:06:36.218276 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.218200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:36.218398 ip-10-0-141-191 
kubenswrapper[2576]: E0422 19:06:36.218323 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:36.218398 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:36.218388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret podName:75bd4507-59f4-478b-8153-398fc3f4f109 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:40.218367552 +0000 UTC m=+24.898845385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret") pod "global-pull-secret-syncer-tksss" (UID: "75bd4507-59f4-478b-8153-398fc3f4f109") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:36.712828 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.712759 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:36.800244 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.800215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:36.800773 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.800757 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:36.990665 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.990634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"19a5979e42625481d03b9ef449e54fdf0dc929e0783fe39c4940cbebb13e46b1"} Apr 22 19:06:36.991048 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.990673 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"37e8b8c43c7324973e7b2a4d8214ed9abc8266a0aae326f26a742f15d1064939"} Apr 22 19:06:36.991842 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.991821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h2chr" event={"ID":"80efecb9-2d19-4b94-9cd9-9207adab24a3","Type":"ContainerStarted","Data":"45c37bb2e1f151330dbb78122d856cb93f1363201a12f1fda61884db4516606a"} Apr 22 19:06:36.992938 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.992919 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d1ae74c-067a-489d-a7eb-707cf1b181a7" containerID="0bf99e18a262b21d72c9d2b4c9b6fc18daf9b97b311900ed5854f4da825eb318" exitCode=0 Apr 22 19:06:36.993037 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.993016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerDied","Data":"0bf99e18a262b21d72c9d2b4c9b6fc18daf9b97b311900ed5854f4da825eb318"} Apr 22 19:06:36.994021 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:36.994005 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z48vt" Apr 22 19:06:37.011185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.009021 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h2chr" podStartSLOduration=5.091607719 podStartE2EDuration="22.008999484s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.493667765 +0000 UTC m=+3.174145614" lastFinishedPulling="2026-04-22 19:06:35.411059534 +0000 UTC m=+20.091537379" observedRunningTime="2026-04-22 19:06:37.008095856 +0000 UTC m=+21.688573711" watchObservedRunningTime="2026-04-22 19:06:37.008999484 +0000 UTC 
m=+21.689477337" Apr 22 19:06:37.573501 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.573478 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:06:37.822035 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.821928 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:06:37.573492883Z","UUID":"2ffe38cd-de2a-44b3-8f40-c4af57aebad2","Handler":null,"Name":"","Endpoint":""} Apr 22 19:06:37.824717 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.824689 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:06:37.824717 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.824716 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:06:37.879793 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.879768 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:37.879923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.879768 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:37.879923 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:37.879890 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:37.880041 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.879782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:37.880041 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:37.879958 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:37.880041 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:37.880024 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:37.996592 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:37.996539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" event={"ID":"192f282f-5991-460f-a756-3f4a540d048a","Type":"ContainerStarted","Data":"a481baf6d82ef4c65796cdc8e6dfc5d80b12bfb7673cf89eef700d5993a00d57"} Apr 22 19:06:39.000712 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:39.000476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" event={"ID":"192f282f-5991-460f-a756-3f4a540d048a","Type":"ContainerStarted","Data":"0ac4f0462b6c314f2b0a3642de0d9d9b1b6777dadab90709e310b9392495dfe0"} Apr 22 19:06:39.003758 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:39.003732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"5d2922f2b78c88b68a1478e98c0484c3ad1013c98bcd7ee1109920fd40f48bd1"} Apr 22 19:06:39.017939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:39.017897 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-67262" podStartSLOduration=4.025260244 podStartE2EDuration="24.017884622s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.502316791 +0000 UTC m=+3.182794624" lastFinishedPulling="2026-04-22 19:06:38.494941165 +0000 UTC m=+23.175419002" observedRunningTime="2026-04-22 19:06:39.016306984 +0000 UTC m=+23.696784836" watchObservedRunningTime="2026-04-22 19:06:39.017884622 +0000 UTC m=+23.698362478" Apr 22 19:06:39.879468 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:39.879433 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:39.879664 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:39.879433 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:39.879664 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:39.879592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:39.879664 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:39.879597 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:39.879825 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:39.879665 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:39.879825 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:39.879748 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:40.250238 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:40.250209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:40.250781 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:40.250372 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:40.250781 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:40.250450 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret podName:75bd4507-59f4-478b-8153-398fc3f4f109 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:48.250427368 +0000 UTC m=+32.930905400 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret") pod "global-pull-secret-syncer-tksss" (UID: "75bd4507-59f4-478b-8153-398fc3f4f109") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:41.879867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:41.879685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:41.880409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:41.879733 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:41.880409 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:41.879945 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:41.880409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:41.879749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:41.880409 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:41.880021 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:41.880409 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:41.880131 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:42.009682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.009657 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d1ae74c-067a-489d-a7eb-707cf1b181a7" containerID="5eb1e4033ed7933ab380cd80f6cc8fdae3ae94f0c86f914db0e4331ea6722b27" exitCode=0 Apr 22 19:06:42.009817 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.009740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerDied","Data":"5eb1e4033ed7933ab380cd80f6cc8fdae3ae94f0c86f914db0e4331ea6722b27"} Apr 22 19:06:42.012605 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.012585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" event={"ID":"65c2e60e-a686-4be8-bb8d-33be235b8b32","Type":"ContainerStarted","Data":"a77bc1c4aa7b6ab58715640e815ba0bd9c9a5b3582d995042d04f2fca634e60f"} Apr 22 19:06:42.012860 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.012843 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:42.012860 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.012862 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:42.027349 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.027307 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:42.056220 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:42.056188 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" podStartSLOduration=9.906582599 podStartE2EDuration="27.056177742s" podCreationTimestamp="2026-04-22 
19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.49690728 +0000 UTC m=+3.177385125" lastFinishedPulling="2026-04-22 19:06:35.646502423 +0000 UTC m=+20.326980268" observedRunningTime="2026-04-22 19:06:42.05569551 +0000 UTC m=+26.736173385" watchObservedRunningTime="2026-04-22 19:06:42.056177742 +0000 UTC m=+26.736655597" Apr 22 19:06:43.015176 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.015149 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:43.028360 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.028333 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:06:43.099969 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.099944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nfm6p"] Apr 22 19:06:43.100081 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.100041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:43.100144 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:43.100128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:43.103500 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.103475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gk4zn"] Apr 22 19:06:43.103611 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.103603 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:43.103733 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:43.103705 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:43.104121 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.104099 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tksss"] Apr 22 19:06:43.104226 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:43.104213 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:43.104332 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:43.104307 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:44.017883 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:44.017715 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d1ae74c-067a-489d-a7eb-707cf1b181a7" containerID="fc8060e680f058a2e91fb088cf27389ea87cbd87927498c25de51ddaa41ad3c2" exitCode=0 Apr 22 19:06:44.018210 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:44.017782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerDied","Data":"fc8060e680f058a2e91fb088cf27389ea87cbd87927498c25de51ddaa41ad3c2"} Apr 22 19:06:44.879948 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:44.879926 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:44.880091 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:44.879925 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:44.880155 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:44.879925 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:44.880155 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:44.880127 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:44.880242 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:44.880153 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:44.880242 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:44.880013 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:45.021758 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:45.021732 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d1ae74c-067a-489d-a7eb-707cf1b181a7" containerID="b073198128fb443ece1f5f20de46efc3d48f9750679472a59914ddd76a9cfb29" exitCode=0 Apr 22 19:06:45.022169 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:45.021822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerDied","Data":"b073198128fb443ece1f5f20de46efc3d48f9750679472a59914ddd76a9cfb29"} Apr 22 19:06:46.879386 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:46.879313 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:46.879921 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:46.879313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:46.879921 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:46.879457 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc" Apr 22 19:06:46.879921 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:46.879313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:46.879921 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:46.879518 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tksss" podUID="75bd4507-59f4-478b-8153-398fc3f4f109" Apr 22 19:06:46.879921 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:46.879599 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nfm6p" podUID="de257b4d-6fda-4abf-96a2-b516950ed9ef" Apr 22 19:06:48.316789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.316752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:48.317326 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:48.316918 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:48.317326 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:48.317000 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret podName:75bd4507-59f4-478b-8153-398fc3f4f109 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:04.31697584 +0000 UTC m=+48.997453679 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret") pod "global-pull-secret-syncer-tksss" (UID: "75bd4507-59f4-478b-8153-398fc3f4f109") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:48.627648 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.627611 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-191.ec2.internal" event="NodeReady" Apr 22 19:06:48.627837 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.627723 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:06:48.669538 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.669509 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-467kd"] Apr 22 19:06:48.688317 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.688291 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-772dj"] Apr 22 19:06:48.688606 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.688572 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.691034 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.691012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:06:48.691299 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.691279 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:06:48.691522 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.691506 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2cl86\"" Apr 22 19:06:48.700359 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.700340 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-467kd"] Apr 22 19:06:48.700459 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.700363 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-772dj"] Apr 22 19:06:48.700518 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.700460 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:48.702843 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.702586 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:06:48.702843 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.702729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lr5f4\"" Apr 22 19:06:48.702843 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.702739 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:06:48.702843 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.702785 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:06:48.819692 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.819651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:48.819692 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.819697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqb8z\" (UniqueName: \"kubernetes.io/projected/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-kube-api-access-vqb8z\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:48.819984 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.819723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/f0879ed5-18cc-4265-8956-15d1b97cade2-tmp-dir\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.819984 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.819752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrmg\" (UniqueName: \"kubernetes.io/projected/f0879ed5-18cc-4265-8956-15d1b97cade2-kube-api-access-hsrmg\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.819984 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.819817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0879ed5-18cc-4265-8956-15d1b97cade2-config-volume\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.819984 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.819870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.879590 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.879490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:06:48.879590 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.879569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:48.879866 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.879789 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:48.882640 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.882612 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:06:48.882640 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.882620 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:06:48.882821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.882708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:06:48.882821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.882764 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6t9wx\"" Apr 22 19:06:48.882926 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.882914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjzgr\"" Apr 22 19:06:48.883000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.882979 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:06:48.920228 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:48.920317 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqb8z\" (UniqueName: 
\"kubernetes.io/projected/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-kube-api-access-vqb8z\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:48.920317 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0879ed5-18cc-4265-8956-15d1b97cade2-tmp-dir\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.920423 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:48.920328 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:48.920423 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrmg\" (UniqueName: \"kubernetes.io/projected/f0879ed5-18cc-4265-8956-15d1b97cade2-kube-api-access-hsrmg\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.920423 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:48.920383 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:06:49.420366408 +0000 UTC m=+34.100844247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:06:48.920595 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0879ed5-18cc-4265-8956-15d1b97cade2-config-volume\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.920595 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0879ed5-18cc-4265-8956-15d1b97cade2-tmp-dir\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.920595 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.920752 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:48.920620 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:48.920752 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:48.920681 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:49.420665245 +0000 UTC m=+34.101143090 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:06:48.920965 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.920947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0879ed5-18cc-4265-8956-15d1b97cade2-config-volume\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.930896 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.930873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrmg\" (UniqueName: \"kubernetes.io/projected/f0879ed5-18cc-4265-8956-15d1b97cade2-kube-api-access-hsrmg\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:48.931078 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:48.931058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqb8z\" (UniqueName: \"kubernetes.io/projected/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-kube-api-access-vqb8z\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:49.425359 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:49.425316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:49.425953 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:49.425401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:49.425953 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:49.425465 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:49.425953 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:49.425505 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:49.425953 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:49.425541 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:06:50.425520012 +0000 UTC m=+35.105997867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:06:49.425953 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:49.425583 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:50.425571808 +0000 UTC m=+35.106049646 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:06:49.526593 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:49.526542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:49.529420 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:49.529396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vqxc\" (UniqueName: \"kubernetes.io/projected/de257b4d-6fda-4abf-96a2-b516950ed9ef-kube-api-access-7vqxc\") pod \"network-check-target-nfm6p\" (UID: \"de257b4d-6fda-4abf-96a2-b516950ed9ef\") " pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:49.627081 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:49.627050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:06:49.627234 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:49.627201 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:06:49.627309 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:49.627276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs 
podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:07:21.627256135 +0000 UTC m=+66.307733976 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : secret "metrics-daemon-secret" not found Apr 22 19:06:49.790412 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:49.790381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:50.432459 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:50.432428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:50.432832 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:50.432489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:50.432832 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:50.432590 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:50.432832 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:50.432649 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. 
No retries permitted until 2026-04-22 19:06:52.432632874 +0000 UTC m=+37.113110713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:06:50.432832 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:50.432590 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:50.432832 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:50.432684 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:52.432675595 +0000 UTC m=+37.113153439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:06:50.657246 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:50.657089 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nfm6p"] Apr 22 19:06:50.721918 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:06:50.721841 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde257b4d_6fda_4abf_96a2_b516950ed9ef.slice/crio-f870fbe3be94372f897a41471cd4a19c0464c56c1ec08b8354cd39a0fea7f78a WatchSource:0}: Error finding container f870fbe3be94372f897a41471cd4a19c0464c56c1ec08b8354cd39a0fea7f78a: Status 404 returned error can't find the container with id f870fbe3be94372f897a41471cd4a19c0464c56c1ec08b8354cd39a0fea7f78a Apr 22 19:06:51.036211 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:51.036182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerStarted","Data":"f11d95103cb481e128b3d5b9a8b903965bb9cbfe0f710176109adf6dfb732511"} Apr 22 19:06:51.037133 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:51.037110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nfm6p" event={"ID":"de257b4d-6fda-4abf-96a2-b516950ed9ef","Type":"ContainerStarted","Data":"f870fbe3be94372f897a41471cd4a19c0464c56c1ec08b8354cd39a0fea7f78a"} Apr 22 19:06:52.041610 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:52.041575 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d1ae74c-067a-489d-a7eb-707cf1b181a7" containerID="f11d95103cb481e128b3d5b9a8b903965bb9cbfe0f710176109adf6dfb732511" exitCode=0 Apr 22 19:06:52.041610 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:52.041624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerDied","Data":"f11d95103cb481e128b3d5b9a8b903965bb9cbfe0f710176109adf6dfb732511"} Apr 22 19:06:52.445612 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:52.445534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:52.445778 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:52.445629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: 
\"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:52.445778 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:52.445672 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:52.445778 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:52.445725 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:52.445778 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:52.445734 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:06:56.445719287 +0000 UTC m=+41.126197120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:06:52.445963 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:52.445792 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:56.445773742 +0000 UTC m=+41.126251575 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:06:53.046537 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:53.046504 2576 generic.go:358] "Generic (PLEG): container finished" podID="0d1ae74c-067a-489d-a7eb-707cf1b181a7" containerID="5c4723bfdbcfd76d4b480cad76172ef5f03be6fb8691bb6b68d3d21cca261350" exitCode=0 Apr 22 19:06:53.046975 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:53.046594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerDied","Data":"5c4723bfdbcfd76d4b480cad76172ef5f03be6fb8691bb6b68d3d21cca261350"} Apr 22 19:06:54.050843 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:54.050677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" event={"ID":"0d1ae74c-067a-489d-a7eb-707cf1b181a7","Type":"ContainerStarted","Data":"c4974c6683b16353e66c510318427d63376210268285ed30fdff41543d9dd7ae"} Apr 22 19:06:54.072133 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:54.072087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sgjxl" podStartSLOduration=5.821180082 podStartE2EDuration="38.072075808s" podCreationTimestamp="2026-04-22 19:06:16 +0000 UTC" firstStartedPulling="2026-04-22 19:06:18.501571238 +0000 UTC m=+3.182049084" lastFinishedPulling="2026-04-22 19:06:50.752466974 +0000 UTC m=+35.432944810" observedRunningTime="2026-04-22 19:06:54.070440087 +0000 UTC m=+38.750917934" watchObservedRunningTime="2026-04-22 19:06:54.072075808 +0000 UTC m=+38.752553662" Apr 22 19:06:55.053397 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:55.053362 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-target-nfm6p" event={"ID":"de257b4d-6fda-4abf-96a2-b516950ed9ef","Type":"ContainerStarted","Data":"d1c46b033d734dfb33cef4a7f4fd8c87e203510bcda4e575d6084b3029c43f71"} Apr 22 19:06:55.053919 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:55.053897 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nfm6p" Apr 22 19:06:55.068822 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:55.068779 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nfm6p" podStartSLOduration=36.724889882 podStartE2EDuration="40.068766926s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:06:50.730120001 +0000 UTC m=+35.410597835" lastFinishedPulling="2026-04-22 19:06:54.073997042 +0000 UTC m=+38.754474879" observedRunningTime="2026-04-22 19:06:55.068019903 +0000 UTC m=+39.748497759" watchObservedRunningTime="2026-04-22 19:06:55.068766926 +0000 UTC m=+39.749244781" Apr 22 19:06:56.475136 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:56.475101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:06:56.475486 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:06:56.475156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:06:56.475486 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:56.475235 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:56.475486 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:56.475237 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:56.475486 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:56.475285 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:04.475271679 +0000 UTC m=+49.155749512 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:06:56.475486 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:06:56.475298 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:07:04.475292269 +0000 UTC m=+49.155770102 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:07:04.330885 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:04.330850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:07:04.337836 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:04.337810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75bd4507-59f4-478b-8153-398fc3f4f109-original-pull-secret\") pod \"global-pull-secret-syncer-tksss\" (UID: \"75bd4507-59f4-478b-8153-398fc3f4f109\") " pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:07:04.502231 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:04.502199 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tksss" Apr 22 19:07:04.532097 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:04.532066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:07:04.532229 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:04.532148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:07:04.532229 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:04.532195 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:04.532308 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:04.532241 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:04.532308 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:04.532263 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:20.532243157 +0000 UTC m=+65.212720995 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:07:04.532308 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:04.532280 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:07:20.53227112 +0000 UTC m=+65.212748971 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:07:04.612775 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:04.612715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tksss"] Apr 22 19:07:05.073625 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:05.073579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tksss" event={"ID":"75bd4507-59f4-478b-8153-398fc3f4f109","Type":"ContainerStarted","Data":"2399cc8b8d380ec1098d82a31a8ee42a03495c7e76fc83b56e78f87aa3d1d442"} Apr 22 19:07:09.084621 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:09.084584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tksss" event={"ID":"75bd4507-59f4-478b-8153-398fc3f4f109","Type":"ContainerStarted","Data":"da7a06f0923f0be3b06696dcc5d8c4554f828f1c7da34e3324be05bb70e35a85"} Apr 22 19:07:09.101245 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:09.101201 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tksss" podStartSLOduration=33.133543897 
podStartE2EDuration="37.101188421s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:07:04.618043019 +0000 UTC m=+49.298520854" lastFinishedPulling="2026-04-22 19:07:08.585687543 +0000 UTC m=+53.266165378" observedRunningTime="2026-04-22 19:07:09.100667869 +0000 UTC m=+53.781145724" watchObservedRunningTime="2026-04-22 19:07:09.101188421 +0000 UTC m=+53.781666276" Apr 22 19:07:15.031758 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:15.031729 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f4rv" Apr 22 19:07:19.946403 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.946370 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"] Apr 22 19:07:19.961777 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.961754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"] Apr 22 19:07:19.961925 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.961851 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:19.964254 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.964232 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:07:19.964377 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.964256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 19:07:19.965139 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.965119 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:07:19.965256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:19.965144 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:07:20.027909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.027878 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5"] Apr 22 19:07:20.041360 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.041328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-tmp\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.041484 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.041400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-klusterlet-config\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.041484 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.041464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82chv\" (UniqueName: \"kubernetes.io/projected/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-kube-api-access-82chv\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.055338 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.055309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5"] Apr 22 19:07:20.055452 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.055344 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.058089 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.058069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:07:20.058211 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.058074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:07:20.058270 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.058222 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:07:20.058353 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.058335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:07:20.141784 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.141757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82chv\" (UniqueName: \"kubernetes.io/projected/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-kube-api-access-82chv\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.141936 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.141795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmvp\" (UniqueName: \"kubernetes.io/projected/d8553d34-d89d-4289-9d92-086b9ad6836a-kube-api-access-6xmvp\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.141936 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.141906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-tmp\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.142004 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.141942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d8553d34-d89d-4289-9d92-086b9ad6836a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.142004 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.141963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-ca\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.142070 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.142019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-hub\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.142070 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.142051 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.142131 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.142093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-klusterlet-config\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.142131 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.142114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.142293 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.142277 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-tmp\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.144359 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.144332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-klusterlet-config\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.150503 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.150485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82chv\" (UniqueName: \"kubernetes.io/projected/df7f6480-ef44-41e0-b13e-f1b3312b9aa8-kube-api-access-82chv\") pod \"klusterlet-addon-workmgr-66cfd488f5-p8cnc\" (UID: \"df7f6480-ef44-41e0-b13e-f1b3312b9aa8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.243004 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.242985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d8553d34-d89d-4289-9d92-086b9ad6836a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.243104 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.243011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-ca\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.243104 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.243040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-hub\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.243104 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.243056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.243267 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.243182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.243328 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.243312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmvp\" (UniqueName: \"kubernetes.io/projected/d8553d34-d89d-4289-9d92-086b9ad6836a-kube-api-access-6xmvp\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.243766 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.243739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d8553d34-d89d-4289-9d92-086b9ad6836a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.245283 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:07:20.245262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.245590 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.245571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-ca\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.245806 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.245787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-hub\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.245866 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.245815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d8553d34-d89d-4289-9d92-086b9ad6836a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.251397 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.251377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmvp\" (UniqueName: 
\"kubernetes.io/projected/d8553d34-d89d-4289-9d92-086b9ad6836a-kube-api-access-6xmvp\") pod \"cluster-proxy-proxy-agent-677d9b9dfb-84pj5\" (UID: \"d8553d34-d89d-4289-9d92-086b9ad6836a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.271699 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.271681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" Apr 22 19:07:20.379003 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.378977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:07:20.380926 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.380904 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"] Apr 22 19:07:20.384631 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:07:20.384602 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7f6480_ef44_41e0_b13e_f1b3312b9aa8.slice/crio-0af3bffa63d477decac1ffa057499ded453ca9db576eb90aaba7e4ff281482cd WatchSource:0}: Error finding container 0af3bffa63d477decac1ffa057499ded453ca9db576eb90aaba7e4ff281482cd: Status 404 returned error can't find the container with id 0af3bffa63d477decac1ffa057499ded453ca9db576eb90aaba7e4ff281482cd Apr 22 19:07:20.490911 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.490885 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5"] Apr 22 19:07:20.494077 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:07:20.494020 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8553d34_d89d_4289_9d92_086b9ad6836a.slice/crio-19b1af9da032dec342c8532645dc92aa6ee640522e431773d92baba86a289155 WatchSource:0}: Error finding container 19b1af9da032dec342c8532645dc92aa6ee640522e431773d92baba86a289155: Status 404 returned error can't find the container with id 19b1af9da032dec342c8532645dc92aa6ee640522e431773d92baba86a289155 Apr 22 19:07:20.545515 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.545492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj" Apr 22 19:07:20.545607 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:20.545540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd" Apr 22 19:07:20.545673 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:20.545658 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:20.545673 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:20.545668 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:20.545733 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:20.545717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:52.545702468 +0000 UTC m=+97.226180301 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found Apr 22 19:07:20.545733 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:20.545730 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:07:52.545724571 +0000 UTC m=+97.226202403 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found Apr 22 19:07:21.109575 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:21.109530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" event={"ID":"df7f6480-ef44-41e0-b13e-f1b3312b9aa8","Type":"ContainerStarted","Data":"0af3bffa63d477decac1ffa057499ded453ca9db576eb90aaba7e4ff281482cd"} Apr 22 19:07:21.110299 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:21.110276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" event={"ID":"d8553d34-d89d-4289-9d92-086b9ad6836a","Type":"ContainerStarted","Data":"19b1af9da032dec342c8532645dc92aa6ee640522e431773d92baba86a289155"} Apr 22 19:07:21.654788 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:21.654750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " 
pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:07:21.654968 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:21.654944 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:07:21.655027 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:21.655006 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:08:25.654987586 +0000 UTC m=+130.335465421 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : secret "metrics-daemon-secret" not found
Apr 22 19:07:25.120861 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:25.120821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" event={"ID":"df7f6480-ef44-41e0-b13e-f1b3312b9aa8","Type":"ContainerStarted","Data":"08b5d7f392a7cb52fc513fa93c27e7361debf987a437af0ff219e716b07623ad"}
Apr 22 19:07:25.121321 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:25.121026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"
Apr 22 19:07:25.122299 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:25.122274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" event={"ID":"d8553d34-d89d-4289-9d92-086b9ad6836a","Type":"ContainerStarted","Data":"7aca2645bbd3120dc5a6ff870a3341a6ea0e3cd7f2220d5000f10c4cf3fef478"}
Apr 22 19:07:25.122624 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:25.122606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"
Apr 22 19:07:25.136278 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:25.136238 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" podStartSLOduration=1.552235697 podStartE2EDuration="6.136225979s" podCreationTimestamp="2026-04-22 19:07:19 +0000 UTC" firstStartedPulling="2026-04-22 19:07:20.38640827 +0000 UTC m=+65.066886104" lastFinishedPulling="2026-04-22 19:07:24.970398549 +0000 UTC m=+69.650876386" observedRunningTime="2026-04-22 19:07:25.13542724 +0000 UTC m=+69.815905094" watchObservedRunningTime="2026-04-22 19:07:25.136225979 +0000 UTC m=+69.816703873"
Apr 22 19:07:27.059122 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:27.059094 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nfm6p"
Apr 22 19:07:28.129741 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:28.129707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" event={"ID":"d8553d34-d89d-4289-9d92-086b9ad6836a","Type":"ContainerStarted","Data":"8da38956c807347387d3174e7ac30bf3467c9af31e1e7b9a5c32e0d905133710"}
Apr 22 19:07:28.129741 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:28.129741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" event={"ID":"d8553d34-d89d-4289-9d92-086b9ad6836a","Type":"ContainerStarted","Data":"d140adf8a1b6fbea8c8ff3dd5bb16529001393c73b0160f7ce2ae0ebfef298d2"}
Apr 22 19:07:28.148149 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:28.148112 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" podStartSLOduration=1.5592080130000001 podStartE2EDuration="8.148100909s" podCreationTimestamp="2026-04-22 19:07:20 +0000 UTC" firstStartedPulling="2026-04-22 19:07:20.495783834 +0000 UTC m=+65.176261667" lastFinishedPulling="2026-04-22 19:07:27.08467673 +0000 UTC m=+71.765154563" observedRunningTime="2026-04-22 19:07:28.146714349 +0000 UTC m=+72.827192203" watchObservedRunningTime="2026-04-22 19:07:28.148100909 +0000 UTC m=+72.828578817"
Apr 22 19:07:52.567102 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:52.567066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj"
Apr 22 19:07:52.567617 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:07:52.567121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd"
Apr 22 19:07:52.567617 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:52.567198 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:07:52.567617 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:52.567221 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:07:52.567617 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:52.567255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:08:56.567240988 +0000 UTC m=+161.247718828 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found
Apr 22 19:07:52.567617 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:07:52.567277 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:56.567260376 +0000 UTC m=+161.247738209 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found
Apr 22 19:08:25.688518 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:08:25.688464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:08:25.689032 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:25.688624 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:08:25.689032 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:25.688692 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs podName:42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc nodeName:}" failed. No retries permitted until 2026-04-22 19:10:27.688674772 +0000 UTC m=+252.369152606 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs") pod "network-metrics-daemon-gk4zn" (UID: "42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc") : secret "metrics-daemon-secret" not found
Apr 22 19:08:49.147314 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:08:49.147289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9gplk_eb917968-4a52-4305-8b42-7cfc0d5bf83c/dns-node-resolver/0.log"
Apr 22 19:08:50.135902 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:08:50.135874 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sqb4x_e0f709c0-fc28-4eab-9cf8-603681f7f300/node-ca/0.log"
Apr 22 19:08:51.699419 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:51.699384 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-467kd" podUID="f0879ed5-18cc-4265-8956-15d1b97cade2"
Apr 22 19:08:51.710689 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:51.710653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-772dj" podUID="0c333e53-81f1-4b5a-91a4-6aad9cbe63aa"
Apr 22 19:08:51.897273 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:51.897239 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gk4zn" podUID="42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc"
Apr 22 19:08:52.313783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:08:52.313750 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-467kd"
Apr 22 19:08:56.593381 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:08:56.593343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj"
Apr 22 19:08:56.593932 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:08:56.593399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd"
Apr 22 19:08:56.593932 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:56.593491 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:08:56.593932 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:56.593576 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert podName:0c333e53-81f1-4b5a-91a4-6aad9cbe63aa nodeName:}" failed. No retries permitted until 2026-04-22 19:10:58.593542906 +0000 UTC m=+283.274020739 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert") pod "ingress-canary-772dj" (UID: "0c333e53-81f1-4b5a-91a4-6aad9cbe63aa") : secret "canary-serving-cert" not found
Apr 22 19:08:56.593932 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:56.593496 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:08:56.593932 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:08:56.593650 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls podName:f0879ed5-18cc-4265-8956-15d1b97cade2 nodeName:}" failed. No retries permitted until 2026-04-22 19:10:58.593633269 +0000 UTC m=+283.274111102 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls") pod "dns-default-467kd" (UID: "f0879ed5-18cc-4265-8956-15d1b97cade2") : secret "dns-default-metrics-tls" not found
Apr 22 19:09:05.880047 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:05.879978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-772dj"
Apr 22 19:09:05.880459 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:05.880157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn"
Apr 22 19:09:09.119413 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.119381 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8z9lw"]
Apr 22 19:09:09.122434 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.122415 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.124934 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.124914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 19:09:09.125046 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.124944 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 19:09:09.126058 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.126039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 19:09:09.126142 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.126055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 19:09:09.126142 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.126080 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vg9n6\""
Apr 22 19:09:09.144107 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.144082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8z9lw"]
Apr 22 19:09:09.287909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.287878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0739b7b9-36a6-4c2f-aafb-af6c20c38569-crio-socket\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.288045 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.287917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0739b7b9-36a6-4c2f-aafb-af6c20c38569-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.288045 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.287993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0739b7b9-36a6-4c2f-aafb-af6c20c38569-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.288045 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.288030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0739b7b9-36a6-4c2f-aafb-af6c20c38569-data-volume\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.288168 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.288056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58n8d\" (UniqueName: \"kubernetes.io/projected/0739b7b9-36a6-4c2f-aafb-af6c20c38569-kube-api-access-58n8d\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.388852 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.388794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0739b7b9-36a6-4c2f-aafb-af6c20c38569-crio-socket\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.388852 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.388826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0739b7b9-36a6-4c2f-aafb-af6c20c38569-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.389000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.388981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0739b7b9-36a6-4c2f-aafb-af6c20c38569-crio-socket\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.389038 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.388991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0739b7b9-36a6-4c2f-aafb-af6c20c38569-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.389068 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.389036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0739b7b9-36a6-4c2f-aafb-af6c20c38569-data-volume\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.389111 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.389065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58n8d\" (UniqueName: \"kubernetes.io/projected/0739b7b9-36a6-4c2f-aafb-af6c20c38569-kube-api-access-58n8d\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.389298 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.389284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0739b7b9-36a6-4c2f-aafb-af6c20c38569-data-volume\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.389416 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.389391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0739b7b9-36a6-4c2f-aafb-af6c20c38569-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.391261 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.391245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0739b7b9-36a6-4c2f-aafb-af6c20c38569-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.397590 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.397570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58n8d\" (UniqueName: \"kubernetes.io/projected/0739b7b9-36a6-4c2f-aafb-af6c20c38569-kube-api-access-58n8d\") pod \"insights-runtime-extractor-8z9lw\" (UID: \"0739b7b9-36a6-4c2f-aafb-af6c20c38569\") " pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.431319 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.431291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8z9lw"
Apr 22 19:09:09.553479 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:09.553426 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8z9lw"]
Apr 22 19:09:09.557692 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:09:09.557664 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0739b7b9_36a6_4c2f_aafb_af6c20c38569.slice/crio-3e7ed9d706048f99817e582aab57aa5acc001fc98c6471c46b4265c0ae164f0f WatchSource:0}: Error finding container 3e7ed9d706048f99817e582aab57aa5acc001fc98c6471c46b4265c0ae164f0f: Status 404 returned error can't find the container with id 3e7ed9d706048f99817e582aab57aa5acc001fc98c6471c46b4265c0ae164f0f
Apr 22 19:09:10.353811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:10.353785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8z9lw" event={"ID":"0739b7b9-36a6-4c2f-aafb-af6c20c38569","Type":"ContainerStarted","Data":"cfdb783ba6266a57acaeff3c1eef85007df47ef5ca91011398feefa0dfea761a"}
Apr 22 19:09:10.354090 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:10.353819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8z9lw" event={"ID":"0739b7b9-36a6-4c2f-aafb-af6c20c38569","Type":"ContainerStarted","Data":"3e7ed9d706048f99817e582aab57aa5acc001fc98c6471c46b4265c0ae164f0f"}
Apr 22 19:09:11.358263 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:11.358225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8z9lw" event={"ID":"0739b7b9-36a6-4c2f-aafb-af6c20c38569","Type":"ContainerStarted","Data":"a65fef4d8fe017b538014f6029e6ef48c9bce6b9eb8970d477c21f2365f96d5c"}
Apr 22 19:09:12.361400 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:12.361365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8z9lw" event={"ID":"0739b7b9-36a6-4c2f-aafb-af6c20c38569","Type":"ContainerStarted","Data":"6cbc1fcb0466c9d3fce8a84bbdfb7b0e338bce3053ea6d82de361fe03a5de15c"}
Apr 22 19:09:12.383610 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:12.383572 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8z9lw" podStartSLOduration=1.106167707 podStartE2EDuration="3.38354175s" podCreationTimestamp="2026-04-22 19:09:09 +0000 UTC" firstStartedPulling="2026-04-22 19:09:09.60734065 +0000 UTC m=+174.287818487" lastFinishedPulling="2026-04-22 19:09:11.884714696 +0000 UTC m=+176.565192530" observedRunningTime="2026-04-22 19:09:12.382202044 +0000 UTC m=+177.062679925" watchObservedRunningTime="2026-04-22 19:09:12.38354175 +0000 UTC m=+177.064019605"
Apr 22 19:09:17.014967 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.014929 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"]
Apr 22 19:09:17.018114 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.018097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:17.021215 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.021188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 19:09:17.021330 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.021197 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7blq9\""
Apr 22 19:09:17.032138 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.032114 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"]
Apr 22 19:09:17.143338 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.143282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f2533978-99d2-4933-bde6-49394145a235-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vmm9h\" (UID: \"f2533978-99d2-4933-bde6-49394145a235\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:17.243954 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.243914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f2533978-99d2-4933-bde6-49394145a235-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vmm9h\" (UID: \"f2533978-99d2-4933-bde6-49394145a235\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:17.246152 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.246134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f2533978-99d2-4933-bde6-49394145a235-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vmm9h\" (UID: \"f2533978-99d2-4933-bde6-49394145a235\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:17.326305 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.326241 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:17.434767 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:17.434741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"]
Apr 22 19:09:17.437346 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:09:17.437317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2533978_99d2_4933_bde6_49394145a235.slice/crio-7d067369a4e5523917e778d19c07f497fd35ea389852d7a06cfc7b64a2967be8 WatchSource:0}: Error finding container 7d067369a4e5523917e778d19c07f497fd35ea389852d7a06cfc7b64a2967be8: Status 404 returned error can't find the container with id 7d067369a4e5523917e778d19c07f497fd35ea389852d7a06cfc7b64a2967be8
Apr 22 19:09:18.376112 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:18.376076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h" event={"ID":"f2533978-99d2-4933-bde6-49394145a235","Type":"ContainerStarted","Data":"7d067369a4e5523917e778d19c07f497fd35ea389852d7a06cfc7b64a2967be8"}
Apr 22 19:09:19.379801 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:19.379768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h" event={"ID":"f2533978-99d2-4933-bde6-49394145a235","Type":"ContainerStarted","Data":"a6ec5eecbd6448fa42ccc4f7dae335c329204fd1266d4c0f8359bfc2ba7653f4"}
Apr 22 19:09:19.380236 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:19.379938 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:19.384603 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:19.384582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h"
Apr 22 19:09:19.397398 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:19.397359 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vmm9h" podStartSLOduration=2.293112155 podStartE2EDuration="3.397346248s" podCreationTimestamp="2026-04-22 19:09:16 +0000 UTC" firstStartedPulling="2026-04-22 19:09:17.439044342 +0000 UTC m=+182.119522178" lastFinishedPulling="2026-04-22 19:09:18.54327842 +0000 UTC m=+183.223756271" observedRunningTime="2026-04-22 19:09:19.396678049 +0000 UTC m=+184.077155904" watchObservedRunningTime="2026-04-22 19:09:19.397346248 +0000 UTC m=+184.077824103"
Apr 22 19:09:25.121814 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:25.121763 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" podUID="df7f6480-ef44-41e0-b13e-f1b3312b9aa8" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused"
Apr 22 19:09:25.395534 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:25.395470 2576 generic.go:358] "Generic (PLEG): container finished" podID="df7f6480-ef44-41e0-b13e-f1b3312b9aa8" containerID="08b5d7f392a7cb52fc513fa93c27e7361debf987a437af0ff219e716b07623ad" exitCode=1
Apr 22 19:09:25.395534 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:25.395528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" event={"ID":"df7f6480-ef44-41e0-b13e-f1b3312b9aa8","Type":"ContainerDied","Data":"08b5d7f392a7cb52fc513fa93c27e7361debf987a437af0ff219e716b07623ad"}
Apr 22 19:09:25.395910 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:25.395885 2576 scope.go:117] "RemoveContainer" containerID="08b5d7f392a7cb52fc513fa93c27e7361debf987a437af0ff219e716b07623ad"
Apr 22 19:09:26.399275 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:26.399244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc" event={"ID":"df7f6480-ef44-41e0-b13e-f1b3312b9aa8","Type":"ContainerStarted","Data":"4259fcb929cc440ff97de04914196f51368e1df63beb7c78a991eeff70617208"}
Apr 22 19:09:26.399641 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:26.399519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"
Apr 22 19:09:26.400132 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:26.400113 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-66cfd488f5-p8cnc"
Apr 22 19:09:28.664107 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.664080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9"]
Apr 22 19:09:28.667141 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.667122 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9"
Apr 22 19:09:28.670123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.670105 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 19:09:28.670414 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.670395 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 19:09:28.671656 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.671637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 19:09:28.671768 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.671681 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 19:09:28.671768 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.671756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 19:09:28.671883 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.671858 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-9tzjw\""
Apr 22 19:09:28.679085 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.679066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cg86r"]
Apr 22 19:09:28.681881 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.681866 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.684427 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.684407 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 19:09:28.684427 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.684418 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mncrp\""
Apr 22 19:09:28.684562 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.684412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 19:09:28.684562 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.684517 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 19:09:28.687488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.687399 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9"]
Apr 22 19:09:28.724915 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.724896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9918c62-5567-45be-abb1-fef6111f9bf1-metrics-client-ca\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.724930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-accelerators-collector-config\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.724963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9"
Apr 22 19:09:28.725015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-sys\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-wtmp\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-textfile\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-root\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725213 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-tls\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725213 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r"
Apr 22 19:09:28.725279 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9"
Apr 22 19:09:28.725279 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-52hs9\"
(UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.725279 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5v6\" (UniqueName: \"kubernetes.io/projected/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-kube-api-access-5p5v6\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.725371 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.725286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wls57\" (UniqueName: \"kubernetes.io/projected/a9918c62-5567-45be-abb1-fef6111f9bf1-kube-api-access-wls57\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826475 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9918c62-5567-45be-abb1-fef6111f9bf1-metrics-client-ca\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826596 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-accelerators-collector-config\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826596 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826501 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.826596 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-sys\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826596 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-wtmp\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-textfile\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-root\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826811 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:09:28.826687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-root\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-wtmp\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-tls\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.826811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.827089 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9918c62-5567-45be-abb1-fef6111f9bf1-sys\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.827089 ip-10-0-141-191 
kubenswrapper[2576]: E0422 19:09:28.826832 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:09:28.827089 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:09:28.826896 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-tls podName:a9918c62-5567-45be-abb1-fef6111f9bf1 nodeName:}" failed. No retries permitted until 2026-04-22 19:09:29.326875573 +0000 UTC m=+194.007353406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-tls") pod "node-exporter-cg86r" (UID: "a9918c62-5567-45be-abb1-fef6111f9bf1") : secret "node-exporter-tls" not found Apr 22 19:09:28.827089 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-textfile\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.827089 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.826825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.827409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.827409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5v6\" (UniqueName: \"kubernetes.io/projected/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-kube-api-access-5p5v6\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.827409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wls57\" (UniqueName: \"kubernetes.io/projected/a9918c62-5567-45be-abb1-fef6111f9bf1-kube-api-access-wls57\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.827409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9918c62-5567-45be-abb1-fef6111f9bf1-metrics-client-ca\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.827409 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-accelerators-collector-config\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.827409 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.827313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.829268 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.829222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.829268 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.829227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.829432 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.829308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.838335 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.838310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wls57\" (UniqueName: 
\"kubernetes.io/projected/a9918c62-5567-45be-abb1-fef6111f9bf1-kube-api-access-wls57\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:28.838722 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.838703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5v6\" (UniqueName: \"kubernetes.io/projected/6334ddf3-d9ef-4f14-ab90-9f695eabbfb8-kube-api-access-5p5v6\") pod \"openshift-state-metrics-9d44df66c-52hs9\" (UID: \"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:28.976049 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:28.975992 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" Apr 22 19:09:29.089980 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.089912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9"] Apr 22 19:09:29.092225 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:09:29.092200 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6334ddf3_d9ef_4f14_ab90_9f695eabbfb8.slice/crio-f6795edbaca0f95d4ef84cb5db236f40b85e3538aa01e39e285b51808bedfa8f WatchSource:0}: Error finding container f6795edbaca0f95d4ef84cb5db236f40b85e3538aa01e39e285b51808bedfa8f: Status 404 returned error can't find the container with id f6795edbaca0f95d4ef84cb5db236f40b85e3538aa01e39e285b51808bedfa8f Apr 22 19:09:29.331200 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.331175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-tls\") pod \"node-exporter-cg86r\" (UID: 
\"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:29.333288 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.333269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a9918c62-5567-45be-abb1-fef6111f9bf1-node-exporter-tls\") pod \"node-exporter-cg86r\" (UID: \"a9918c62-5567-45be-abb1-fef6111f9bf1\") " pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:29.407011 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.406984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" event={"ID":"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8","Type":"ContainerStarted","Data":"77f9969614cd23edca214489d1d2046b5511a8c88eabe51e1bc8aa53f579a67c"} Apr 22 19:09:29.407011 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.407017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" event={"ID":"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8","Type":"ContainerStarted","Data":"2cea23caccfe7d2b6cdebcc6b96fa129ecd7d8cc38284ba3515e8b82c4b4609e"} Apr 22 19:09:29.407011 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.407026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" event={"ID":"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8","Type":"ContainerStarted","Data":"f6795edbaca0f95d4ef84cb5db236f40b85e3538aa01e39e285b51808bedfa8f"} Apr 22 19:09:29.590403 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:29.590327 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cg86r" Apr 22 19:09:29.598011 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:09:29.597987 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9918c62_5567_45be_abb1_fef6111f9bf1.slice/crio-5d717da395fa5030e3609094a9e2014d43e0ace04f36c589b3402dea6344d00d WatchSource:0}: Error finding container 5d717da395fa5030e3609094a9e2014d43e0ace04f36c589b3402dea6344d00d: Status 404 returned error can't find the container with id 5d717da395fa5030e3609094a9e2014d43e0ace04f36c589b3402dea6344d00d Apr 22 19:09:30.410472 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:30.410434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg86r" event={"ID":"a9918c62-5567-45be-abb1-fef6111f9bf1","Type":"ContainerStarted","Data":"5d717da395fa5030e3609094a9e2014d43e0ace04f36c589b3402dea6344d00d"} Apr 22 19:09:30.412369 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:30.412335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" event={"ID":"6334ddf3-d9ef-4f14-ab90-9f695eabbfb8","Type":"ContainerStarted","Data":"50e670a0c09b9e3d3a1b0eb60a7c33e1bc52b691fb3c68c4bb6acd89a5ca9344"} Apr 22 19:09:30.433349 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:30.433276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-52hs9" podStartSLOduration=1.460257275 podStartE2EDuration="2.433263111s" podCreationTimestamp="2026-04-22 19:09:28 +0000 UTC" firstStartedPulling="2026-04-22 19:09:29.205976774 +0000 UTC m=+193.886454611" lastFinishedPulling="2026-04-22 19:09:30.178982611 +0000 UTC m=+194.859460447" observedRunningTime="2026-04-22 19:09:30.432215937 +0000 UTC m=+195.112693795" watchObservedRunningTime="2026-04-22 19:09:30.433263111 +0000 UTC m=+195.113740966" Apr 22 
19:09:31.416057 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:31.416024 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9918c62-5567-45be-abb1-fef6111f9bf1" containerID="2ad898c961532350822b09851e89a096a39e2a78ce5e55bb212550474e273f6b" exitCode=0 Apr 22 19:09:31.416401 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:31.416113 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg86r" event={"ID":"a9918c62-5567-45be-abb1-fef6111f9bf1","Type":"ContainerDied","Data":"2ad898c961532350822b09851e89a096a39e2a78ce5e55bb212550474e273f6b"} Apr 22 19:09:32.420394 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:32.420360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg86r" event={"ID":"a9918c62-5567-45be-abb1-fef6111f9bf1","Type":"ContainerStarted","Data":"899b6d2a8baded8a716912b1ed260a3a34062dee3c5b25b3a6e6076b6d286dab"} Apr 22 19:09:32.420394 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:32.420397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cg86r" event={"ID":"a9918c62-5567-45be-abb1-fef6111f9bf1","Type":"ContainerStarted","Data":"eaa83f094d56533d34fbbd009ce420dcdb9a2e5f72815e7818e0823fb7f6c29f"} Apr 22 19:09:32.444589 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:09:32.444525 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cg86r" podStartSLOduration=3.511027418 podStartE2EDuration="4.444510039s" podCreationTimestamp="2026-04-22 19:09:28 +0000 UTC" firstStartedPulling="2026-04-22 19:09:29.599616444 +0000 UTC m=+194.280094278" lastFinishedPulling="2026-04-22 19:09:30.533099062 +0000 UTC m=+195.213576899" observedRunningTime="2026-04-22 19:09:32.442679453 +0000 UTC m=+197.123157309" watchObservedRunningTime="2026-04-22 19:09:32.444510039 +0000 UTC m=+197.124987894" Apr 22 19:10:00.380494 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:00.380457 
2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" podUID="d8553d34-d89d-4289-9d92-086b9ad6836a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:10:10.380351 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:10.380313 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" podUID="d8553d34-d89d-4289-9d92-086b9ad6836a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:10:20.379839 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:20.379800 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" podUID="d8553d34-d89d-4289-9d92-086b9ad6836a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 19:10:20.380193 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:20.379864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" Apr 22 19:10:20.380307 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:20.380277 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"8da38956c807347387d3174e7ac30bf3467c9af31e1e7b9a5c32e0d905133710"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 19:10:20.380342 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:20.380327 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" 
podUID="d8553d34-d89d-4289-9d92-086b9ad6836a" containerName="service-proxy" containerID="cri-o://8da38956c807347387d3174e7ac30bf3467c9af31e1e7b9a5c32e0d905133710" gracePeriod=30 Apr 22 19:10:20.538666 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:20.538642 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8553d34-d89d-4289-9d92-086b9ad6836a" containerID="8da38956c807347387d3174e7ac30bf3467c9af31e1e7b9a5c32e0d905133710" exitCode=2 Apr 22 19:10:20.538760 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:20.538683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" event={"ID":"d8553d34-d89d-4289-9d92-086b9ad6836a","Type":"ContainerDied","Data":"8da38956c807347387d3174e7ac30bf3467c9af31e1e7b9a5c32e0d905133710"} Apr 22 19:10:21.542974 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:21.542940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-677d9b9dfb-84pj5" event={"ID":"d8553d34-d89d-4289-9d92-086b9ad6836a","Type":"ContainerStarted","Data":"11a4c7a702253463574f5602246831f1d89d47a8f5ebdd54610d91df473fdb52"} Apr 22 19:10:27.746905 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:27.746862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:10:27.749058 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:27.749039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc-metrics-certs\") pod \"network-metrics-daemon-gk4zn\" (UID: \"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc\") " 
pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:10:27.783291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:27.783268 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjzgr\"" Apr 22 19:10:27.791295 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:27.791278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gk4zn" Apr 22 19:10:27.905753 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:27.905732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gk4zn"] Apr 22 19:10:27.907953 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:10:27.907926 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a195d9_ce8e_4dc5_9cb1_8e23ae31d6bc.slice/crio-5a2b6254868fcc2bf768165cceeb180eb8fe25f2fa7376c9186e0750feb577af WatchSource:0}: Error finding container 5a2b6254868fcc2bf768165cceeb180eb8fe25f2fa7376c9186e0750feb577af: Status 404 returned error can't find the container with id 5a2b6254868fcc2bf768165cceeb180eb8fe25f2fa7376c9186e0750feb577af Apr 22 19:10:28.559173 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:28.559094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gk4zn" event={"ID":"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc","Type":"ContainerStarted","Data":"5a2b6254868fcc2bf768165cceeb180eb8fe25f2fa7376c9186e0750feb577af"} Apr 22 19:10:29.562741 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:29.562699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gk4zn" event={"ID":"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc","Type":"ContainerStarted","Data":"b503f6c70b83cd74376b5d0d8b24e276117d6987f2c10f35dbf8da3f39278431"} Apr 22 19:10:29.562741 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:29.562740 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gk4zn" event={"ID":"42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc","Type":"ContainerStarted","Data":"28d0c54f7e3932dd1caa718bef9462ca362c058452911dfe64973a07f37a703b"}
Apr 22 19:10:29.591674 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:29.591632 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gk4zn" podStartSLOduration=253.485833745 podStartE2EDuration="4m14.591619537s" podCreationTimestamp="2026-04-22 19:06:15 +0000 UTC" firstStartedPulling="2026-04-22 19:10:27.909443618 +0000 UTC m=+252.589921454" lastFinishedPulling="2026-04-22 19:10:29.01522941 +0000 UTC m=+253.695707246" observedRunningTime="2026-04-22 19:10:29.590138471 +0000 UTC m=+254.270616323" watchObservedRunningTime="2026-04-22 19:10:29.591619537 +0000 UTC m=+254.272097391"
Apr 22 19:10:55.314459 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:10:55.314417 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-467kd" podUID="f0879ed5-18cc-4265-8956-15d1b97cade2"
Apr 22 19:10:55.630071 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:55.629995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-467kd"
Apr 22 19:10:58.655923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.655893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd"
Apr 22 19:10:58.655923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.655929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj"
Apr 22 19:10:58.658138 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.658115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0879ed5-18cc-4265-8956-15d1b97cade2-metrics-tls\") pod \"dns-default-467kd\" (UID: \"f0879ed5-18cc-4265-8956-15d1b97cade2\") " pod="openshift-dns/dns-default-467kd"
Apr 22 19:10:58.658229 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.658163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c333e53-81f1-4b5a-91a4-6aad9cbe63aa-cert\") pod \"ingress-canary-772dj\" (UID: \"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa\") " pod="openshift-ingress-canary/ingress-canary-772dj"
Apr 22 19:10:58.683602 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.683580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lr5f4\""
Apr 22 19:10:58.691955 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.691940 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-772dj"
Apr 22 19:10:58.816421 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.816386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-772dj"]
Apr 22 19:10:58.819380 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:10:58.819356 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c333e53_81f1_4b5a_91a4_6aad9cbe63aa.slice/crio-c0e57d74c1bc50d7747ef441cc2d4f78a3901c89ffd31ae4acc6347f211f52ff WatchSource:0}: Error finding container c0e57d74c1bc50d7747ef441cc2d4f78a3901c89ffd31ae4acc6347f211f52ff: Status 404 returned error can't find the container with id c0e57d74c1bc50d7747ef441cc2d4f78a3901c89ffd31ae4acc6347f211f52ff
Apr 22 19:10:58.933644 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.933589 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2cl86\""
Apr 22 19:10:58.940814 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:58.940798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-467kd"
Apr 22 19:10:59.053232 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:59.053206 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-467kd"]
Apr 22 19:10:59.056202 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:10:59.056176 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0879ed5_18cc_4265_8956_15d1b97cade2.slice/crio-46c05d73671a5601397836a127270c608334ceb2de31a9ef60ad0182390a4f6c WatchSource:0}: Error finding container 46c05d73671a5601397836a127270c608334ceb2de31a9ef60ad0182390a4f6c: Status 404 returned error can't find the container with id 46c05d73671a5601397836a127270c608334ceb2de31a9ef60ad0182390a4f6c
Apr 22 19:10:59.642117 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:59.642051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-772dj" event={"ID":"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa","Type":"ContainerStarted","Data":"c0e57d74c1bc50d7747ef441cc2d4f78a3901c89ffd31ae4acc6347f211f52ff"}
Apr 22 19:10:59.643675 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:10:59.643626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-467kd" event={"ID":"f0879ed5-18cc-4265-8956-15d1b97cade2","Type":"ContainerStarted","Data":"46c05d73671a5601397836a127270c608334ceb2de31a9ef60ad0182390a4f6c"}
Apr 22 19:11:01.650353 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:01.650316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-467kd" event={"ID":"f0879ed5-18cc-4265-8956-15d1b97cade2","Type":"ContainerStarted","Data":"302901ab48c6c577165ac94142c50de08b46c11bd422dfcd33fe803d01fddb11"}
Apr 22 19:11:01.650353 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:01.650357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-467kd" event={"ID":"f0879ed5-18cc-4265-8956-15d1b97cade2","Type":"ContainerStarted","Data":"91623c08fa60b747662a9700dc1a096c98693e4a8f1d6e9ff754b913d9f4ad3d"}
Apr 22 19:11:01.650861 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:01.650460 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-467kd"
Apr 22 19:11:01.651561 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:01.651529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-772dj" event={"ID":"0c333e53-81f1-4b5a-91a4-6aad9cbe63aa","Type":"ContainerStarted","Data":"af6ef1e329ebce49e340522ac0b496f1f241b7fef050b359ac78534e32aee96b"}
Apr 22 19:11:01.668578 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:01.668522 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-467kd" podStartSLOduration=251.944098248 podStartE2EDuration="4m13.668511266s" podCreationTimestamp="2026-04-22 19:06:48 +0000 UTC" firstStartedPulling="2026-04-22 19:10:59.058080275 +0000 UTC m=+283.738558111" lastFinishedPulling="2026-04-22 19:11:00.782493292 +0000 UTC m=+285.462971129" observedRunningTime="2026-04-22 19:11:01.66716649 +0000 UTC m=+286.347644344" watchObservedRunningTime="2026-04-22 19:11:01.668511266 +0000 UTC m=+286.348989120"
Apr 22 19:11:01.683952 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:01.683908 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-772dj" podStartSLOduration=251.723804427 podStartE2EDuration="4m13.683893016s" podCreationTimestamp="2026-04-22 19:06:48 +0000 UTC" firstStartedPulling="2026-04-22 19:10:58.821090664 +0000 UTC m=+283.501568500" lastFinishedPulling="2026-04-22 19:11:00.781179253 +0000 UTC m=+285.461657089" observedRunningTime="2026-04-22 19:11:01.682436692 +0000 UTC m=+286.362914547" watchObservedRunningTime="2026-04-22 19:11:01.683893016 +0000 UTC m=+286.364370871"
Apr 22 19:11:11.658592 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:11:11.658561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-467kd"
Apr 22 19:15:08.312458 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.312374 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"]
Apr 22 19:15:08.315470 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.315452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.318184 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.318161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lgbk5\""
Apr 22 19:15:08.318440 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.318424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 19:15:08.319059 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.319042 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 19:15:08.325912 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.325893 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"]
Apr 22 19:15:08.470374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.470346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxj8h\" (UniqueName: \"kubernetes.io/projected/e1712e12-dae5-4ee4-b587-62a49e2f830e-kube-api-access-vxj8h\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.470483 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.470387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.470483 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.470413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.571624 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.571568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxj8h\" (UniqueName: \"kubernetes.io/projected/e1712e12-dae5-4ee4-b587-62a49e2f830e-kube-api-access-vxj8h\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.571624 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.571614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.571757 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.571637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.571926 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.571912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.572004 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.571987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.580941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.580916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxj8h\" (UniqueName: \"kubernetes.io/projected/e1712e12-dae5-4ee4-b587-62a49e2f830e-kube-api-access-vxj8h\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.624896 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.624875 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:08.737584 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.737560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"]
Apr 22 19:15:08.739115 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:15:08.739079 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1712e12_dae5_4ee4_b587_62a49e2f830e.slice/crio-2c1a88468547620763bd684011c3bd78dfe43c7c2d0fb6837de08fb6222b42d7 WatchSource:0}: Error finding container 2c1a88468547620763bd684011c3bd78dfe43c7c2d0fb6837de08fb6222b42d7: Status 404 returned error can't find the container with id 2c1a88468547620763bd684011c3bd78dfe43c7c2d0fb6837de08fb6222b42d7
Apr 22 19:15:08.741208 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:08.741193 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:15:09.268123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:09.268093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9" event={"ID":"e1712e12-dae5-4ee4-b587-62a49e2f830e","Type":"ContainerStarted","Data":"2c1a88468547620763bd684011c3bd78dfe43c7c2d0fb6837de08fb6222b42d7"}
Apr 22 19:15:14.283158 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:14.283122 2576 generic.go:358] "Generic (PLEG): container finished" podID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerID="823e2933816d4ea7ae792956fef47f03509d5cf72eaac5ca51fba6404357ad66" exitCode=0
Apr 22 19:15:14.283621 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:14.283171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9" event={"ID":"e1712e12-dae5-4ee4-b587-62a49e2f830e","Type":"ContainerDied","Data":"823e2933816d4ea7ae792956fef47f03509d5cf72eaac5ca51fba6404357ad66"}
Apr 22 19:15:16.290403 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:16.290374 2576 generic.go:358] "Generic (PLEG): container finished" podID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerID="ae0ac5795c0d21b7573916cc28ca296a71971ed88a1a0a0945b66251e5d5e134" exitCode=0
Apr 22 19:15:16.290802 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:16.290420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9" event={"ID":"e1712e12-dae5-4ee4-b587-62a49e2f830e","Type":"ContainerDied","Data":"ae0ac5795c0d21b7573916cc28ca296a71971ed88a1a0a0945b66251e5d5e134"}
Apr 22 19:15:22.309893 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:22.309859 2576 generic.go:358] "Generic (PLEG): container finished" podID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerID="7d1b3d3a0b589b5d843b72fd4d7848d712bb910b6267a050a609d614a52331c6" exitCode=0
Apr 22 19:15:22.310228 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:22.309905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9" event={"ID":"e1712e12-dae5-4ee4-b587-62a49e2f830e","Type":"ContainerDied","Data":"7d1b3d3a0b589b5d843b72fd4d7848d712bb910b6267a050a609d614a52331c6"}
Apr 22 19:15:23.425415 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.425394 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:23.583395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.583306 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-bundle\") pod \"e1712e12-dae5-4ee4-b587-62a49e2f830e\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") "
Apr 22 19:15:23.583395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.583349 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-util\") pod \"e1712e12-dae5-4ee4-b587-62a49e2f830e\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") "
Apr 22 19:15:23.583633 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.583438 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxj8h\" (UniqueName: \"kubernetes.io/projected/e1712e12-dae5-4ee4-b587-62a49e2f830e-kube-api-access-vxj8h\") pod \"e1712e12-dae5-4ee4-b587-62a49e2f830e\" (UID: \"e1712e12-dae5-4ee4-b587-62a49e2f830e\") "
Apr 22 19:15:23.583945 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.583915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-bundle" (OuterVolumeSpecName: "bundle") pod "e1712e12-dae5-4ee4-b587-62a49e2f830e" (UID: "e1712e12-dae5-4ee4-b587-62a49e2f830e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:15:23.585588 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.585541 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1712e12-dae5-4ee4-b587-62a49e2f830e-kube-api-access-vxj8h" (OuterVolumeSpecName: "kube-api-access-vxj8h") pod "e1712e12-dae5-4ee4-b587-62a49e2f830e" (UID: "e1712e12-dae5-4ee4-b587-62a49e2f830e"). InnerVolumeSpecName "kube-api-access-vxj8h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:15:23.587595 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.587573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-util" (OuterVolumeSpecName: "util") pod "e1712e12-dae5-4ee4-b587-62a49e2f830e" (UID: "e1712e12-dae5-4ee4-b587-62a49e2f830e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:15:23.684747 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.684716 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxj8h\" (UniqueName: \"kubernetes.io/projected/e1712e12-dae5-4ee4-b587-62a49e2f830e-kube-api-access-vxj8h\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:15:23.684747 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.684744 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:15:23.684878 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:23.684757 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1712e12-dae5-4ee4-b587-62a49e2f830e-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:15:24.316316 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:24.316286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9" event={"ID":"e1712e12-dae5-4ee4-b587-62a49e2f830e","Type":"ContainerDied","Data":"2c1a88468547620763bd684011c3bd78dfe43c7c2d0fb6837de08fb6222b42d7"}
Apr 22 19:15:24.316316 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:24.316313 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkh8r9"
Apr 22 19:15:24.316489 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:24.316318 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c1a88468547620763bd684011c3bd78dfe43c7c2d0fb6837de08fb6222b42d7"
Apr 22 19:15:31.352914 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.352882 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"]
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353122 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="pull"
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353132 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="pull"
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353142 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="util"
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353148 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="util"
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353160 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="extract"
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353166 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="extract"
Apr 22 19:15:31.353269 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.353205 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1712e12-dae5-4ee4-b587-62a49e2f830e" containerName="extract"
Apr 22 19:15:31.390593 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.390565 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"]
Apr 22 19:15:31.390692 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.390668 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.393761 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.393736 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 19:15:31.393859 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.393777 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:15:31.394060 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.394044 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-qszwm\""
Apr 22 19:15:31.537251 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.537218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e9aa6cb-d7ee-4052-8e31-fd222c5b666c-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-m82g9\" (UID: \"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.537406 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.537277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4l6\" (UniqueName: \"kubernetes.io/projected/5e9aa6cb-d7ee-4052-8e31-fd222c5b666c-kube-api-access-cw4l6\") pod \"cert-manager-operator-controller-manager-54b9655956-m82g9\" (UID: \"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.637927 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.637860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4l6\" (UniqueName: \"kubernetes.io/projected/5e9aa6cb-d7ee-4052-8e31-fd222c5b666c-kube-api-access-cw4l6\") pod \"cert-manager-operator-controller-manager-54b9655956-m82g9\" (UID: \"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.637927 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.637903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e9aa6cb-d7ee-4052-8e31-fd222c5b666c-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-m82g9\" (UID: \"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.638204 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.638186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e9aa6cb-d7ee-4052-8e31-fd222c5b666c-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-m82g9\" (UID: \"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.646514 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.646486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4l6\" (UniqueName: \"kubernetes.io/projected/5e9aa6cb-d7ee-4052-8e31-fd222c5b666c-kube-api-access-cw4l6\") pod \"cert-manager-operator-controller-manager-54b9655956-m82g9\" (UID: \"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.699476 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.699454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"
Apr 22 19:15:31.818256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:31.818176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9"]
Apr 22 19:15:31.820615 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:15:31.820588 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9aa6cb_d7ee_4052_8e31_fd222c5b666c.slice/crio-2df2d6c96bda031971e9d8c494debcba95276c05b3b8ccc8e95d6ee6640fe37b WatchSource:0}: Error finding container 2df2d6c96bda031971e9d8c494debcba95276c05b3b8ccc8e95d6ee6640fe37b: Status 404 returned error can't find the container with id 2df2d6c96bda031971e9d8c494debcba95276c05b3b8ccc8e95d6ee6640fe37b
Apr 22 19:15:32.336846 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:32.336810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9" event={"ID":"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c","Type":"ContainerStarted","Data":"2df2d6c96bda031971e9d8c494debcba95276c05b3b8ccc8e95d6ee6640fe37b"}
Apr 22 19:15:34.347110 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:34.347078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9" event={"ID":"5e9aa6cb-d7ee-4052-8e31-fd222c5b666c","Type":"ContainerStarted","Data":"83df548d2f6e6707190186a2772757aaa0ce4cc62b42427bee14da727e0ae5e3"}
Apr 22 19:15:34.375590 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:34.375529 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-m82g9" podStartSLOduration=1.252330847 podStartE2EDuration="3.375517294s" podCreationTimestamp="2026-04-22 19:15:31 +0000 UTC" firstStartedPulling="2026-04-22 19:15:31.823040511 +0000 UTC m=+556.503518343" lastFinishedPulling="2026-04-22 19:15:33.946226953 +0000 UTC m=+558.626704790" observedRunningTime="2026-04-22 19:15:34.37404765 +0000 UTC m=+559.054525499" watchObservedRunningTime="2026-04-22 19:15:34.375517294 +0000 UTC m=+559.055995149"
Apr 22 19:15:35.603616 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.603577 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"]
Apr 22 19:15:35.622034 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.622010 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"]
Apr 22 19:15:35.622181 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.622135 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.625956 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.625918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 19:15:35.626913 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.626895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lgbk5\""
Apr 22 19:15:35.627021 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.626956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 19:15:35.670895 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.670872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.671000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.670906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.671000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.670928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4jt\" (UniqueName: \"kubernetes.io/projected/720442bf-3fac-4c08-bcd8-66d96e1ba762-kube-api-access-ss4jt\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.771803 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.771774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.771914 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.771827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4jt\" (UniqueName: \"kubernetes.io/projected/720442bf-3fac-4c08-bcd8-66d96e1ba762-kube-api-access-ss4jt\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.771914 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.771894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.772135 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.772116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.772217 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.772197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.800506 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.800477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4jt\" (UniqueName: \"kubernetes.io/projected/720442bf-3fac-4c08-bcd8-66d96e1ba762-kube-api-access-ss4jt\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:35.932197 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:35.932130 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"
Apr 22 19:15:36.053844 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:36.053813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l"]
Apr 22 19:15:36.056605 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:15:36.056579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720442bf_3fac_4c08_bcd8_66d96e1ba762.slice/crio-e9a8fceb8092b3a27e62c65747cf0984ca8e222c6a7beb3dee39fecc0e02f741 WatchSource:0}: Error finding container e9a8fceb8092b3a27e62c65747cf0984ca8e222c6a7beb3dee39fecc0e02f741: Status 404 returned error can't find the container with id e9a8fceb8092b3a27e62c65747cf0984ca8e222c6a7beb3dee39fecc0e02f741
Apr 22 19:15:36.354126 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:36.354090 2576 generic.go:358] "Generic (PLEG): container finished" podID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerID="bd4ed9334c22def6c014fc0e592ee5606fa9dd5185f1e9353ac5d01eb8818481" exitCode=0
Apr 22 19:15:36.354263 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:36.354131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" event={"ID":"720442bf-3fac-4c08-bcd8-66d96e1ba762","Type":"ContainerDied","Data":"bd4ed9334c22def6c014fc0e592ee5606fa9dd5185f1e9353ac5d01eb8818481"}
Apr 22 19:15:36.354263 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:36.354154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" event={"ID":"720442bf-3fac-4c08-bcd8-66d96e1ba762","Type":"ContainerStarted","Data":"e9a8fceb8092b3a27e62c65747cf0984ca8e222c6a7beb3dee39fecc0e02f741"}
Apr 22 19:15:39.365940 ip-10-0-141-191 kubenswrapper[2576]:
I0422 19:15:39.365900 2576 generic.go:358] "Generic (PLEG): container finished" podID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerID="e7727005d461a0d9c6e650867041f89060e27d24aca39349feb69495ce2af90e" exitCode=0 Apr 22 19:15:39.366357 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:39.365944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" event={"ID":"720442bf-3fac-4c08-bcd8-66d96e1ba762","Type":"ContainerDied","Data":"e7727005d461a0d9c6e650867041f89060e27d24aca39349feb69495ce2af90e"} Apr 22 19:15:40.371237 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:40.371204 2576 generic.go:358] "Generic (PLEG): container finished" podID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerID="4bd0d4843ecb19157a7a5068075fc48304d64ba1853ea27758fd5907a969248a" exitCode=0 Apr 22 19:15:40.371708 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:40.371283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" event={"ID":"720442bf-3fac-4c08-bcd8-66d96e1ba762","Type":"ContainerDied","Data":"4bd0d4843ecb19157a7a5068075fc48304d64ba1853ea27758fd5907a969248a"} Apr 22 19:15:41.489880 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.489857 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" Apr 22 19:15:41.516484 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.516459 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4jt\" (UniqueName: \"kubernetes.io/projected/720442bf-3fac-4c08-bcd8-66d96e1ba762-kube-api-access-ss4jt\") pod \"720442bf-3fac-4c08-bcd8-66d96e1ba762\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " Apr 22 19:15:41.516652 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.516499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-bundle\") pod \"720442bf-3fac-4c08-bcd8-66d96e1ba762\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " Apr 22 19:15:41.516652 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.516521 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-util\") pod \"720442bf-3fac-4c08-bcd8-66d96e1ba762\" (UID: \"720442bf-3fac-4c08-bcd8-66d96e1ba762\") " Apr 22 19:15:41.516909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.516882 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-bundle" (OuterVolumeSpecName: "bundle") pod "720442bf-3fac-4c08-bcd8-66d96e1ba762" (UID: "720442bf-3fac-4c08-bcd8-66d96e1ba762"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:15:41.518726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.518704 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720442bf-3fac-4c08-bcd8-66d96e1ba762-kube-api-access-ss4jt" (OuterVolumeSpecName: "kube-api-access-ss4jt") pod "720442bf-3fac-4c08-bcd8-66d96e1ba762" (UID: "720442bf-3fac-4c08-bcd8-66d96e1ba762"). InnerVolumeSpecName "kube-api-access-ss4jt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:15:41.524157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.524128 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-util" (OuterVolumeSpecName: "util") pod "720442bf-3fac-4c08-bcd8-66d96e1ba762" (UID: "720442bf-3fac-4c08-bcd8-66d96e1ba762"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:15:41.617238 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.617203 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss4jt\" (UniqueName: \"kubernetes.io/projected/720442bf-3fac-4c08-bcd8-66d96e1ba762-kube-api-access-ss4jt\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:15:41.617238 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.617234 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:15:41.617238 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:41.617246 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/720442bf-3fac-4c08-bcd8-66d96e1ba762-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:15:42.378281 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:42.378252 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" Apr 22 19:15:42.378447 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:42.378250 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fblk2l" event={"ID":"720442bf-3fac-4c08-bcd8-66d96e1ba762","Type":"ContainerDied","Data":"e9a8fceb8092b3a27e62c65747cf0984ca8e222c6a7beb3dee39fecc0e02f741"} Apr 22 19:15:42.378447 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:42.378372 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a8fceb8092b3a27e62c65747cf0984ca8e222c6a7beb3dee39fecc0e02f741" Apr 22 19:15:48.254561 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254507 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r"] Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254768 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="util" Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254778 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="util" Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254785 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="pull" Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254790 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="pull" Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254799 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="extract" Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254804 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="extract" Apr 22 19:15:48.254929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.254854 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="720442bf-3fac-4c08-bcd8-66d96e1ba762" containerName="extract" Apr 22 19:15:48.259473 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.259456 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.263525 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.263502 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-lh8g8\"" Apr 22 19:15:48.263669 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.263529 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:15:48.264357 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.264338 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 19:15:48.266692 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.266665 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r"] Apr 22 19:15:48.361411 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.361379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwh6p\" (UniqueName: \"kubernetes.io/projected/db1ba053-7c0b-4491-92b5-535da3303c77-kube-api-access-pwh6p\") pod \"openshift-lws-operator-bfc7f696d-7bx5r\" (UID: \"db1ba053-7c0b-4491-92b5-535da3303c77\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.361598 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.361418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/db1ba053-7c0b-4491-92b5-535da3303c77-tmp\") pod \"openshift-lws-operator-bfc7f696d-7bx5r\" (UID: \"db1ba053-7c0b-4491-92b5-535da3303c77\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.462639 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.462604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwh6p\" (UniqueName: \"kubernetes.io/projected/db1ba053-7c0b-4491-92b5-535da3303c77-kube-api-access-pwh6p\") pod \"openshift-lws-operator-bfc7f696d-7bx5r\" (UID: \"db1ba053-7c0b-4491-92b5-535da3303c77\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.462762 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.462648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/db1ba053-7c0b-4491-92b5-535da3303c77-tmp\") pod \"openshift-lws-operator-bfc7f696d-7bx5r\" (UID: \"db1ba053-7c0b-4491-92b5-535da3303c77\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.463083 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.463066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/db1ba053-7c0b-4491-92b5-535da3303c77-tmp\") pod \"openshift-lws-operator-bfc7f696d-7bx5r\" (UID: \"db1ba053-7c0b-4491-92b5-535da3303c77\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.472367 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.472342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwh6p\" (UniqueName: 
\"kubernetes.io/projected/db1ba053-7c0b-4491-92b5-535da3303c77-kube-api-access-pwh6p\") pod \"openshift-lws-operator-bfc7f696d-7bx5r\" (UID: \"db1ba053-7c0b-4491-92b5-535da3303c77\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.568889 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.568814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" Apr 22 19:15:48.692665 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:48.692642 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r"] Apr 22 19:15:48.694771 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:15:48.694746 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb1ba053_7c0b_4491_92b5_535da3303c77.slice/crio-5ffe233be6bf6d25164d0713a3dec250edb66b9dcc011b69dd5e4eb564c25526 WatchSource:0}: Error finding container 5ffe233be6bf6d25164d0713a3dec250edb66b9dcc011b69dd5e4eb564c25526: Status 404 returned error can't find the container with id 5ffe233be6bf6d25164d0713a3dec250edb66b9dcc011b69dd5e4eb564c25526 Apr 22 19:15:49.401161 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:49.401123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" event={"ID":"db1ba053-7c0b-4491-92b5-535da3303c77","Type":"ContainerStarted","Data":"5ffe233be6bf6d25164d0713a3dec250edb66b9dcc011b69dd5e4eb564c25526"} Apr 22 19:15:50.405786 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:50.405755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" event={"ID":"db1ba053-7c0b-4491-92b5-535da3303c77","Type":"ContainerStarted","Data":"8e1cd3012d9a2eb07f798eb9ff81e6085af09509e54b8818f6057623fe0ede7a"} Apr 22 19:15:50.422715 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:15:50.422667 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7bx5r" podStartSLOduration=0.837512429 podStartE2EDuration="2.422653043s" podCreationTimestamp="2026-04-22 19:15:48 +0000 UTC" firstStartedPulling="2026-04-22 19:15:48.696230807 +0000 UTC m=+573.376708641" lastFinishedPulling="2026-04-22 19:15:50.281371422 +0000 UTC m=+574.961849255" observedRunningTime="2026-04-22 19:15:50.421283084 +0000 UTC m=+575.101760939" watchObservedRunningTime="2026-04-22 19:15:50.422653043 +0000 UTC m=+575.103130898" Apr 22 19:15:53.183655 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.183594 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq"] Apr 22 19:15:53.186967 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.186951 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.189713 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.189683 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:15:53.190669 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.190646 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:15:53.190762 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.190651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lgbk5\"" Apr 22 19:15:53.196323 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.196304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq"] Apr 22 19:15:53.293092 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.293066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.293186 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.293100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjbxm\" (UniqueName: \"kubernetes.io/projected/ef42a5af-168b-4309-a8b5-88b263ddc9fb-kube-api-access-cjbxm\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.293186 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.293130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.394333 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.394306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.394442 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:15:53.394337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjbxm\" (UniqueName: \"kubernetes.io/projected/ef42a5af-168b-4309-a8b5-88b263ddc9fb-kube-api-access-cjbxm\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.394442 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.394356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.394668 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.394652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.394759 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.394739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.404959 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.404942 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cjbxm\" (UniqueName: \"kubernetes.io/projected/ef42a5af-168b-4309-a8b5-88b263ddc9fb-kube-api-access-cjbxm\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.496013 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.495983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:53.615376 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:53.615353 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq"] Apr 22 19:15:53.617652 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:15:53.617625 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef42a5af_168b_4309_a8b5_88b263ddc9fb.slice/crio-67e81fc74edceb7f03b17c88e2ec192c77ad447190ade853623b31c92063d335 WatchSource:0}: Error finding container 67e81fc74edceb7f03b17c88e2ec192c77ad447190ade853623b31c92063d335: Status 404 returned error can't find the container with id 67e81fc74edceb7f03b17c88e2ec192c77ad447190ade853623b31c92063d335 Apr 22 19:15:54.417795 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:54.417767 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerID="c82595ca8960b61abb0ec1614a964ca2a42eaf9e0dfd2ce92451c2d9395146ee" exitCode=0 Apr 22 19:15:54.418090 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:54.417806 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" 
event={"ID":"ef42a5af-168b-4309-a8b5-88b263ddc9fb","Type":"ContainerDied","Data":"c82595ca8960b61abb0ec1614a964ca2a42eaf9e0dfd2ce92451c2d9395146ee"} Apr 22 19:15:54.418090 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:54.417827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" event={"ID":"ef42a5af-168b-4309-a8b5-88b263ddc9fb","Type":"ContainerStarted","Data":"67e81fc74edceb7f03b17c88e2ec192c77ad447190ade853623b31c92063d335"} Apr 22 19:15:55.423000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:55.422910 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerID="379cf8268aa859d82c7f023754bf548825da61971cd7958e6cdbbba8844a3956" exitCode=0 Apr 22 19:15:55.423000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:55.422976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" event={"ID":"ef42a5af-168b-4309-a8b5-88b263ddc9fb","Type":"ContainerDied","Data":"379cf8268aa859d82c7f023754bf548825da61971cd7958e6cdbbba8844a3956"} Apr 22 19:15:56.427973 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:56.427941 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerID="c58d448641b6c41b002e01a56caf20e1279925090c98d85e27c152e8fdcaa698" exitCode=0 Apr 22 19:15:56.428332 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:56.427985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" event={"ID":"ef42a5af-168b-4309-a8b5-88b263ddc9fb","Type":"ContainerDied","Data":"c58d448641b6c41b002e01a56caf20e1279925090c98d85e27c152e8fdcaa698"} Apr 22 19:15:57.545949 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.545929 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" Apr 22 19:15:57.626191 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.626170 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-util\") pod \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " Apr 22 19:15:57.626306 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.626202 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjbxm\" (UniqueName: \"kubernetes.io/projected/ef42a5af-168b-4309-a8b5-88b263ddc9fb-kube-api-access-cjbxm\") pod \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " Apr 22 19:15:57.626306 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.626219 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-bundle\") pod \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\" (UID: \"ef42a5af-168b-4309-a8b5-88b263ddc9fb\") " Apr 22 19:15:57.626927 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.626902 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-bundle" (OuterVolumeSpecName: "bundle") pod "ef42a5af-168b-4309-a8b5-88b263ddc9fb" (UID: "ef42a5af-168b-4309-a8b5-88b263ddc9fb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:15:57.628119 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.628090 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef42a5af-168b-4309-a8b5-88b263ddc9fb-kube-api-access-cjbxm" (OuterVolumeSpecName: "kube-api-access-cjbxm") pod "ef42a5af-168b-4309-a8b5-88b263ddc9fb" (UID: "ef42a5af-168b-4309-a8b5-88b263ddc9fb"). InnerVolumeSpecName "kube-api-access-cjbxm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:15:57.631166 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.631123 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-util" (OuterVolumeSpecName: "util") pod "ef42a5af-168b-4309-a8b5-88b263ddc9fb" (UID: "ef42a5af-168b-4309-a8b5-88b263ddc9fb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:15:57.726793 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.726738 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:15:57.726793 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.726759 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cjbxm\" (UniqueName: \"kubernetes.io/projected/ef42a5af-168b-4309-a8b5-88b263ddc9fb-kube-api-access-cjbxm\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:15:57.726793 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:57.726770 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef42a5af-168b-4309-a8b5-88b263ddc9fb-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:15:58.435163 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:58.435121 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq" event={"ID":"ef42a5af-168b-4309-a8b5-88b263ddc9fb","Type":"ContainerDied","Data":"67e81fc74edceb7f03b17c88e2ec192c77ad447190ade853623b31c92063d335"}
Apr 22 19:15:58.435163 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:58.435163 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67e81fc74edceb7f03b17c88e2ec192c77ad447190ade853623b31c92063d335"
Apr 22 19:15:58.435351 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:15:58.435199 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58m5wq"
Apr 22 19:16:04.931146 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931118 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"]
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931347 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="pull"
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931356 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="pull"
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931366 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="extract"
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931372 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="extract"
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931385 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="util"
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931390 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="util"
Apr 22 19:16:04.931488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.931429 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef42a5af-168b-4309-a8b5-88b263ddc9fb" containerName="extract"
Apr 22 19:16:04.935447 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.935428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:04.940170 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.940148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 19:16:04.940509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.940472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lgbk5\""
Apr 22 19:16:04.940509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.940477 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 19:16:04.952477 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.952452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"]
Apr 22 19:16:04.971158 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.971136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:04.971268 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.971178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:04.971268 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:04.971201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb2pr\" (UniqueName: \"kubernetes.io/projected/82617e7d-3a0c-415f-929f-cd0300062d31-kube-api-access-qb2pr\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.072321 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.072294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2pr\" (UniqueName: \"kubernetes.io/projected/82617e7d-3a0c-415f-929f-cd0300062d31-kube-api-access-qb2pr\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.072414 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.072342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.072414 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.072378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.072766 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.072748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.072809 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.072762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.083821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.083798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2pr\" (UniqueName: \"kubernetes.io/projected/82617e7d-3a0c-415f-929f-cd0300062d31-kube-api-access-qb2pr\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.262594 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.262570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:05.387855 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.387829 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"]
Apr 22 19:16:05.389432 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:16:05.389401 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82617e7d_3a0c_415f_929f_cd0300062d31.slice/crio-71604459ec292285118f7dca7755b11cab6171bed4ee82472c3ab327ccc0377b WatchSource:0}: Error finding container 71604459ec292285118f7dca7755b11cab6171bed4ee82472c3ab327ccc0377b: Status 404 returned error can't find the container with id 71604459ec292285118f7dca7755b11cab6171bed4ee82472c3ab327ccc0377b
Apr 22 19:16:05.456676 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.456647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt" event={"ID":"82617e7d-3a0c-415f-929f-cd0300062d31","Type":"ContainerStarted","Data":"89558495b1aa16d41bf870a72e8059308eea9318047da476ff5cae12fcc1abbf"}
Apr 22 19:16:05.456804 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:05.456681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt" event={"ID":"82617e7d-3a0c-415f-929f-cd0300062d31","Type":"ContainerStarted","Data":"71604459ec292285118f7dca7755b11cab6171bed4ee82472c3ab327ccc0377b"}
Apr 22 19:16:06.461522 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.461445 2576 generic.go:358] "Generic (PLEG): container finished" podID="82617e7d-3a0c-415f-929f-cd0300062d31" containerID="89558495b1aa16d41bf870a72e8059308eea9318047da476ff5cae12fcc1abbf" exitCode=0
Apr 22 19:16:06.461860 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.461531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt" event={"ID":"82617e7d-3a0c-415f-929f-cd0300062d31","Type":"ContainerDied","Data":"89558495b1aa16d41bf870a72e8059308eea9318047da476ff5cae12fcc1abbf"}
Apr 22 19:16:06.776808 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.776776 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"]
Apr 22 19:16:06.779919 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.779904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.784300 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.784277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 19:16:06.784397 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.784351 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 19:16:06.784744 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.784728 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 19:16:06.784744 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.784739 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 19:16:06.786789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.786773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-dvr7t\""
Apr 22 19:16:06.887261 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.887239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m7hk\" (UniqueName: \"kubernetes.io/projected/b9b5dba6-27d1-4353-968a-0049be88faea-kube-api-access-9m7hk\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.887360 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.887319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9b5dba6-27d1-4353-968a-0049be88faea-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.887360 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.887351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9b5dba6-27d1-4353-968a-0049be88faea-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.887749 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.887731 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"]
Apr 22 19:16:06.987636 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.987615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9b5dba6-27d1-4353-968a-0049be88faea-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.987735 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.987646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9b5dba6-27d1-4353-968a-0049be88faea-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.987735 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.987673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9m7hk\" (UniqueName: \"kubernetes.io/projected/b9b5dba6-27d1-4353-968a-0049be88faea-kube-api-access-9m7hk\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.990033 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.990012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9b5dba6-27d1-4353-968a-0049be88faea-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:06.990113 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:06.990040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9b5dba6-27d1-4353-968a-0049be88faea-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:07.001822 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:07.001800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m7hk\" (UniqueName: \"kubernetes.io/projected/b9b5dba6-27d1-4353-968a-0049be88faea-kube-api-access-9m7hk\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wqwc6\" (UID: \"b9b5dba6-27d1-4353-968a-0049be88faea\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:07.089586 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:07.089533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:07.214944 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:07.214922 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"]
Apr 22 19:16:07.217257 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:16:07.217225 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b5dba6_27d1_4353_968a_0049be88faea.slice/crio-b4cd1c4a8eeed0ee2d6ef00c0f31c26df2a02d32ea8a365d9398d0d922918449 WatchSource:0}: Error finding container b4cd1c4a8eeed0ee2d6ef00c0f31c26df2a02d32ea8a365d9398d0d922918449: Status 404 returned error can't find the container with id b4cd1c4a8eeed0ee2d6ef00c0f31c26df2a02d32ea8a365d9398d0d922918449
Apr 22 19:16:07.465679 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:07.465621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6" event={"ID":"b9b5dba6-27d1-4353-968a-0049be88faea","Type":"ContainerStarted","Data":"b4cd1c4a8eeed0ee2d6ef00c0f31c26df2a02d32ea8a365d9398d0d922918449"}
Apr 22 19:16:07.467157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:07.467132 2576 generic.go:358] "Generic (PLEG): container finished" podID="82617e7d-3a0c-415f-929f-cd0300062d31" containerID="5ffafd0cf4d6859e26590ab123ef7ad535b0770114c4a03ac3f621a235c28203" exitCode=0
Apr 22 19:16:07.467244 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:07.467201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt" event={"ID":"82617e7d-3a0c-415f-929f-cd0300062d31","Type":"ContainerDied","Data":"5ffafd0cf4d6859e26590ab123ef7ad535b0770114c4a03ac3f621a235c28203"}
Apr 22 19:16:08.477943 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:08.477904 2576 generic.go:358] "Generic (PLEG): container finished" podID="82617e7d-3a0c-415f-929f-cd0300062d31" containerID="11c560bf14fe4f3b704ce7225ba9a5b28b8fc1425355b885aa886c6e12c00453" exitCode=0
Apr 22 19:16:08.478379 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:08.477979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt" event={"ID":"82617e7d-3a0c-415f-929f-cd0300062d31","Type":"ContainerDied","Data":"11c560bf14fe4f3b704ce7225ba9a5b28b8fc1425355b885aa886c6e12c00453"}
Apr 22 19:16:09.647816 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.647792 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:09.708805 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.708783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb2pr\" (UniqueName: \"kubernetes.io/projected/82617e7d-3a0c-415f-929f-cd0300062d31-kube-api-access-qb2pr\") pod \"82617e7d-3a0c-415f-929f-cd0300062d31\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") "
Apr 22 19:16:09.708909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.708854 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-util\") pod \"82617e7d-3a0c-415f-929f-cd0300062d31\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") "
Apr 22 19:16:09.708909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.708887 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-bundle\") pod \"82617e7d-3a0c-415f-929f-cd0300062d31\" (UID: \"82617e7d-3a0c-415f-929f-cd0300062d31\") "
Apr 22 19:16:09.709751 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.709729 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-bundle" (OuterVolumeSpecName: "bundle") pod "82617e7d-3a0c-415f-929f-cd0300062d31" (UID: "82617e7d-3a0c-415f-929f-cd0300062d31"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:16:09.710645 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.710624 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82617e7d-3a0c-415f-929f-cd0300062d31-kube-api-access-qb2pr" (OuterVolumeSpecName: "kube-api-access-qb2pr") pod "82617e7d-3a0c-415f-929f-cd0300062d31" (UID: "82617e7d-3a0c-415f-929f-cd0300062d31"). InnerVolumeSpecName "kube-api-access-qb2pr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:16:09.714142 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.714119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-util" (OuterVolumeSpecName: "util") pod "82617e7d-3a0c-415f-929f-cd0300062d31" (UID: "82617e7d-3a0c-415f-929f-cd0300062d31"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:16:09.809909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.809872 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qb2pr\" (UniqueName: \"kubernetes.io/projected/82617e7d-3a0c-415f-929f-cd0300062d31-kube-api-access-qb2pr\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:16:09.809909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.809903 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:16:09.809909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:09.809916 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82617e7d-3a0c-415f-929f-cd0300062d31-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:16:10.485717 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:10.485667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6" event={"ID":"b9b5dba6-27d1-4353-968a-0049be88faea","Type":"ContainerStarted","Data":"2e00b16eb0ff56c9104f061530531fb72d71cb94231ab177dd55c6aae61c10da"}
Apr 22 19:16:10.485891 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:10.485814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:10.487261 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:10.487239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt" event={"ID":"82617e7d-3a0c-415f-929f-cd0300062d31","Type":"ContainerDied","Data":"71604459ec292285118f7dca7755b11cab6171bed4ee82472c3ab327ccc0377b"}
Apr 22 19:16:10.487334 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:10.487264 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71604459ec292285118f7dca7755b11cab6171bed4ee82472c3ab327ccc0377b"
Apr 22 19:16:10.487334 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:10.487264 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gttwt"
Apr 22 19:16:10.508328 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:10.508280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6" podStartSLOduration=2.046165909 podStartE2EDuration="4.508264538s" podCreationTimestamp="2026-04-22 19:16:06 +0000 UTC" firstStartedPulling="2026-04-22 19:16:07.218937741 +0000 UTC m=+591.899415575" lastFinishedPulling="2026-04-22 19:16:09.681036366 +0000 UTC m=+594.361514204" observedRunningTime="2026-04-22 19:16:10.506043832 +0000 UTC m=+595.186521700" watchObservedRunningTime="2026-04-22 19:16:10.508264538 +0000 UTC m=+595.188742392"
Apr 22 19:16:13.347736 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347697 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"]
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347950 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="pull"
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347961 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="pull"
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347974 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="extract"
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347979 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="extract"
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347988 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="util"
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.347994 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="util"
Apr 22 19:16:13.348157 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.348033 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="82617e7d-3a0c-415f-929f-cd0300062d31" containerName="extract"
Apr 22 19:16:13.351231 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.351212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.355574 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.355541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 19:16:13.355869 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.355845 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kpm8c\""
Apr 22 19:16:13.355957 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.355866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 19:16:13.355957 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.355921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 19:16:13.363899 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.363879 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"]
Apr 22 19:16:13.435377 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.435348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc21ea98-0bde-4948-a397-97e39c96fd9a-metrics-cert\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.435519 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.435390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2fc6\" (UniqueName: \"kubernetes.io/projected/bc21ea98-0bde-4948-a397-97e39c96fd9a-kube-api-access-g2fc6\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.435519 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.435433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bc21ea98-0bde-4948-a397-97e39c96fd9a-manager-config\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.435519 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.435462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc21ea98-0bde-4948-a397-97e39c96fd9a-cert\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.536107 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.536078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bc21ea98-0bde-4948-a397-97e39c96fd9a-manager-config\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.536272 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.536115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc21ea98-0bde-4948-a397-97e39c96fd9a-cert\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.536272 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.536134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc21ea98-0bde-4948-a397-97e39c96fd9a-metrics-cert\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.536391 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.536315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2fc6\" (UniqueName: \"kubernetes.io/projected/bc21ea98-0bde-4948-a397-97e39c96fd9a-kube-api-access-g2fc6\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.536863 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.536840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bc21ea98-0bde-4948-a397-97e39c96fd9a-manager-config\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.538593 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.538573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc21ea98-0bde-4948-a397-97e39c96fd9a-cert\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.538685 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.538601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc21ea98-0bde-4948-a397-97e39c96fd9a-metrics-cert\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.548885 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.548850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2fc6\" (UniqueName: \"kubernetes.io/projected/bc21ea98-0bde-4948-a397-97e39c96fd9a-kube-api-access-g2fc6\") pod \"lws-controller-manager-6769c56bf6-p5fkh\" (UID: \"bc21ea98-0bde-4948-a397-97e39c96fd9a\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.659941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.659872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:13.784966 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:13.784939 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"]
Apr 22 19:16:13.787298 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:16:13.787271 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc21ea98_0bde_4948_a397_97e39c96fd9a.slice/crio-a2883abfe190dbc5d84c7e4b60534849542bc4db3b1f8d83d9679076cdf7fdf1 WatchSource:0}: Error finding container a2883abfe190dbc5d84c7e4b60534849542bc4db3b1f8d83d9679076cdf7fdf1: Status 404 returned error can't find the container with id a2883abfe190dbc5d84c7e4b60534849542bc4db3b1f8d83d9679076cdf7fdf1
Apr 22 19:16:14.502086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:14.502049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh" event={"ID":"bc21ea98-0bde-4948-a397-97e39c96fd9a","Type":"ContainerStarted","Data":"a2883abfe190dbc5d84c7e4b60534849542bc4db3b1f8d83d9679076cdf7fdf1"}
Apr 22 19:16:17.512901 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:17.512862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh" event={"ID":"bc21ea98-0bde-4948-a397-97e39c96fd9a","Type":"ContainerStarted","Data":"3e2248fd619b4b2289bac5392aee3fb85086bd285021c0cc2b9ba9272972bc1f"}
Apr 22 19:16:17.513287 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:17.513079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh"
Apr 22 19:16:21.492565 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:21.492522 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wqwc6"
Apr 22 19:16:21.514897 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:21.514851 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh" podStartSLOduration=5.7163577740000004 podStartE2EDuration="8.514836945s" podCreationTimestamp="2026-04-22 19:16:13 +0000 UTC" firstStartedPulling="2026-04-22 19:16:13.78899758 +0000 UTC m=+598.469475415" lastFinishedPulling="2026-04-22 19:16:16.58747675 +0000 UTC m=+601.267954586" observedRunningTime="2026-04-22 19:16:17.558496466 +0000 UTC m=+602.238974321" watchObservedRunningTime="2026-04-22 19:16:21.514836945 +0000 UTC m=+606.195314799"
Apr 22 19:16:24.259276 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.259245 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"]
Apr 22 19:16:24.261484 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.261466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"
Apr 22 19:16:24.264060 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.264043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 19:16:24.264159 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.264123 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 19:16:24.268142 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.268123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lgbk5\""
Apr 22 19:16:24.275246 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.275225 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"]
Apr 22 19:16:24.319044 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.319022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"
Apr 22 19:16:24.319180 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.319067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgn42\" (UniqueName: \"kubernetes.io/projected/ae114081-3d8c-47b6-ac39-b22a443f93e7-kube-api-access-lgn42\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"
Apr 22 19:16:24.319180 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.319136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"
Apr 22 19:16:24.419838 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.419809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"
Apr 22 19:16:24.419953 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.419854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgn42\" (UniqueName: \"kubernetes.io/projected/ae114081-3d8c-47b6-ac39-b22a443f93e7-kube-api-access-lgn42\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"
Apr 22 19:16:24.419953 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.419883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") "
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:24.420168 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.420148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:24.420243 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.420185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:24.441370 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.441341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgn42\" (UniqueName: \"kubernetes.io/projected/ae114081-3d8c-47b6-ac39-b22a443f93e7-kube-api-access-lgn42\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:24.570204 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.570146 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:24.692585 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:24.692559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q"] Apr 22 19:16:24.695142 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:16:24.695116 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae114081_3d8c_47b6_ac39_b22a443f93e7.slice/crio-516300aa74af14e91d0e731606e71c3f0edc26d153e545d12874a0ec086b2869 WatchSource:0}: Error finding container 516300aa74af14e91d0e731606e71c3f0edc26d153e545d12874a0ec086b2869: Status 404 returned error can't find the container with id 516300aa74af14e91d0e731606e71c3f0edc26d153e545d12874a0ec086b2869 Apr 22 19:16:25.541273 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:25.541235 2576 generic.go:358] "Generic (PLEG): container finished" podID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerID="1e8fa79d55c3690de95f0d2d3fab148c28b46210f5c7c394ac372d4c19a6554d" exitCode=0 Apr 22 19:16:25.541652 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:25.541329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" event={"ID":"ae114081-3d8c-47b6-ac39-b22a443f93e7","Type":"ContainerDied","Data":"1e8fa79d55c3690de95f0d2d3fab148c28b46210f5c7c394ac372d4c19a6554d"} Apr 22 19:16:25.541652 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:25.541368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" event={"ID":"ae114081-3d8c-47b6-ac39-b22a443f93e7","Type":"ContainerStarted","Data":"516300aa74af14e91d0e731606e71c3f0edc26d153e545d12874a0ec086b2869"} Apr 22 19:16:28.518035 ip-10-0-141-191 kubenswrapper[2576]: 
I0422 19:16:28.518004 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-p5fkh" Apr 22 19:16:34.574188 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:34.574111 2576 generic.go:358] "Generic (PLEG): container finished" podID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerID="e9fdd9cc26acf2da4e29ddc7728c3b0fb8210138f996b178057e9f357a7b376b" exitCode=0 Apr 22 19:16:34.574188 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:34.574170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" event={"ID":"ae114081-3d8c-47b6-ac39-b22a443f93e7","Type":"ContainerDied","Data":"e9fdd9cc26acf2da4e29ddc7728c3b0fb8210138f996b178057e9f357a7b376b"} Apr 22 19:16:35.579331 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:35.579302 2576 generic.go:358] "Generic (PLEG): container finished" podID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerID="49ff70c9597e9d3fa4f4ed30998c71be1caf8def02ab5782d56adf3f794d7376" exitCode=0 Apr 22 19:16:35.579710 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:35.579374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" event={"ID":"ae114081-3d8c-47b6-ac39-b22a443f93e7","Type":"ContainerDied","Data":"49ff70c9597e9d3fa4f4ed30998c71be1caf8def02ab5782d56adf3f794d7376"} Apr 22 19:16:36.697866 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.697845 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:36.800106 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.800070 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-bundle\") pod \"ae114081-3d8c-47b6-ac39-b22a443f93e7\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " Apr 22 19:16:36.800106 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.800114 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-util\") pod \"ae114081-3d8c-47b6-ac39-b22a443f93e7\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " Apr 22 19:16:36.800261 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.800135 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgn42\" (UniqueName: \"kubernetes.io/projected/ae114081-3d8c-47b6-ac39-b22a443f93e7-kube-api-access-lgn42\") pod \"ae114081-3d8c-47b6-ac39-b22a443f93e7\" (UID: \"ae114081-3d8c-47b6-ac39-b22a443f93e7\") " Apr 22 19:16:36.801023 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.801000 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-bundle" (OuterVolumeSpecName: "bundle") pod "ae114081-3d8c-47b6-ac39-b22a443f93e7" (UID: "ae114081-3d8c-47b6-ac39-b22a443f93e7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:16:36.802112 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.802084 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae114081-3d8c-47b6-ac39-b22a443f93e7-kube-api-access-lgn42" (OuterVolumeSpecName: "kube-api-access-lgn42") pod "ae114081-3d8c-47b6-ac39-b22a443f93e7" (UID: "ae114081-3d8c-47b6-ac39-b22a443f93e7"). InnerVolumeSpecName "kube-api-access-lgn42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:16:36.804608 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.804562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-util" (OuterVolumeSpecName: "util") pod "ae114081-3d8c-47b6-ac39-b22a443f93e7" (UID: "ae114081-3d8c-47b6-ac39-b22a443f93e7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:16:36.900580 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.900517 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgn42\" (UniqueName: \"kubernetes.io/projected/ae114081-3d8c-47b6-ac39-b22a443f93e7-kube-api-access-lgn42\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:16:36.900580 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.900537 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:16:36.900687 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:36.900583 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae114081-3d8c-47b6-ac39-b22a443f93e7-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:16:37.586351 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:37.586318 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" Apr 22 19:16:37.586351 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:37.586328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358jg4q" event={"ID":"ae114081-3d8c-47b6-ac39-b22a443f93e7","Type":"ContainerDied","Data":"516300aa74af14e91d0e731606e71c3f0edc26d153e545d12874a0ec086b2869"} Apr 22 19:16:37.586534 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:37.586363 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="516300aa74af14e91d0e731606e71c3f0edc26d153e545d12874a0ec086b2869" Apr 22 19:16:43.377477 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377443 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c"] Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377725 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="util" Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377740 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="util" Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377749 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="extract" Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377755 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="extract" Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377769 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="pull" Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377775 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="pull" Apr 22 19:16:43.377867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.377838 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae114081-3d8c-47b6-ac39-b22a443f93e7" containerName="extract" Apr 22 19:16:43.386900 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.386881 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.390506 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.390482 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:16:43.390731 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.390717 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:16:43.391510 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.391490 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lgbk5\"" Apr 22 19:16:43.395986 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.395966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c"] Apr 22 19:16:43.446508 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.446482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: 
\"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.446620 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.446513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xrm9\" (UniqueName: \"kubernetes.io/projected/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-kube-api-access-2xrm9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.446620 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.446531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.547291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.547268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.547364 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.547294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xrm9\" (UniqueName: \"kubernetes.io/projected/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-kube-api-access-2xrm9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: 
\"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.547364 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.547311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.547618 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.547602 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.547652 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.547623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.569131 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.569105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xrm9\" (UniqueName: \"kubernetes.io/projected/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-kube-api-access-2xrm9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.696139 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.696085 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:43.820390 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:43.820368 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c"] Apr 22 19:16:43.822112 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:16:43.822088 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bbb6e61_c3df_48e0_a30a_e452bcae3ed0.slice/crio-8b2c0326c3ddd357f557a54660efd61809929e7edae5d4216885a0ab7a2e32e7 WatchSource:0}: Error finding container 8b2c0326c3ddd357f557a54660efd61809929e7edae5d4216885a0ab7a2e32e7: Status 404 returned error can't find the container with id 8b2c0326c3ddd357f557a54660efd61809929e7edae5d4216885a0ab7a2e32e7 Apr 22 19:16:44.614345 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:44.614310 2576 generic.go:358] "Generic (PLEG): container finished" podID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerID="8893ceeb9f7aaa1f7e204fd2aeec81c45e01a5ff50d189ee521e481e90e2d873" exitCode=0 Apr 22 19:16:44.614687 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:44.614400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" event={"ID":"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0","Type":"ContainerDied","Data":"8893ceeb9f7aaa1f7e204fd2aeec81c45e01a5ff50d189ee521e481e90e2d873"} Apr 22 19:16:44.614687 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:44.614437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" event={"ID":"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0","Type":"ContainerStarted","Data":"8b2c0326c3ddd357f557a54660efd61809929e7edae5d4216885a0ab7a2e32e7"} Apr 22 19:16:47.626138 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:47.626108 2576 generic.go:358] "Generic (PLEG): container finished" podID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerID="979225da92f4edb84c89b7b0f981f37f33bc07f1750f964a8ebda0a054a2d7c3" exitCode=0 Apr 22 19:16:47.626498 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:47.626168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" event={"ID":"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0","Type":"ContainerDied","Data":"979225da92f4edb84c89b7b0f981f37f33bc07f1750f964a8ebda0a054a2d7c3"} Apr 22 19:16:48.630924 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:48.630891 2576 generic.go:358] "Generic (PLEG): container finished" podID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerID="225790797d5b1d6851e1e3f4ae8e32e8f689b78b0c551fd1e49e513f25990dbd" exitCode=0 Apr 22 19:16:48.631328 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:48.630957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" event={"ID":"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0","Type":"ContainerDied","Data":"225790797d5b1d6851e1e3f4ae8e32e8f689b78b0c551fd1e49e513f25990dbd"} Apr 22 19:16:49.752243 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.752219 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:49.897928 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.897866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xrm9\" (UniqueName: \"kubernetes.io/projected/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-kube-api-access-2xrm9\") pod \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " Apr 22 19:16:49.897928 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.897922 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-util\") pod \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " Apr 22 19:16:49.898079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.897965 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-bundle\") pod \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\" (UID: \"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0\") " Apr 22 19:16:49.898767 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.898733 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-bundle" (OuterVolumeSpecName: "bundle") pod "4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" (UID: "4bbb6e61-c3df-48e0-a30a-e452bcae3ed0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:16:49.899873 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.899848 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-kube-api-access-2xrm9" (OuterVolumeSpecName: "kube-api-access-2xrm9") pod "4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" (UID: "4bbb6e61-c3df-48e0-a30a-e452bcae3ed0"). InnerVolumeSpecName "kube-api-access-2xrm9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:16:49.902529 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.902509 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-util" (OuterVolumeSpecName: "util") pod "4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" (UID: "4bbb6e61-c3df-48e0-a30a-e452bcae3ed0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:16:49.998658 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.998636 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:16:49.998658 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.998655 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:16:49.998826 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:49.998665 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xrm9\" (UniqueName: \"kubernetes.io/projected/4bbb6e61-c3df-48e0-a30a-e452bcae3ed0-kube-api-access-2xrm9\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:16:50.639656 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:50.639624 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" Apr 22 19:16:50.639821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:50.639625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebc9s9c" event={"ID":"4bbb6e61-c3df-48e0-a30a-e452bcae3ed0","Type":"ContainerDied","Data":"8b2c0326c3ddd357f557a54660efd61809929e7edae5d4216885a0ab7a2e32e7"} Apr 22 19:16:50.639821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:50.639735 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b2c0326c3ddd357f557a54660efd61809929e7edae5d4216885a0ab7a2e32e7" Apr 22 19:16:53.159429 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159396 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"] Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159702 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="util" Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159714 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="util" Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159722 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="pull" Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159727 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="pull" Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159733 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="extract"
Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159738 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="extract"
Apr 22 19:16:53.160726 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.159779 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bbb6e61-c3df-48e0-a30a-e452bcae3ed0" containerName="extract"
Apr 22 19:16:53.161352 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.161337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.164699 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.164679 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 19:16:53.164812 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.164719 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 19:16:53.164812 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.164777 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-lk48x\""
Apr 22 19:16:53.164931 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.164861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 19:16:53.187995 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.187966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"]
Apr 22 19:16:53.321162 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321162 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321354 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/778ca226-228b-49c6-af36-d9c52e4ed5e0-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321354 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321354 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321354 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321354 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321521 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.321521 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.321409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lv6d\" (UniqueName: \"kubernetes.io/projected/778ca226-228b-49c6-af36-d9c52e4ed5e0-kube-api-access-4lv6d\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.422874 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/778ca226-228b-49c6-af36-d9c52e4ed5e0-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.422874 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.422996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lv6d\" (UniqueName: \"kubernetes.io/projected/778ca226-228b-49c6-af36-d9c52e4ed5e0-kube-api-access-4lv6d\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423086 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423378 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423435 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423435 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423538 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423630 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/778ca226-228b-49c6-af36-d9c52e4ed5e0-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.423705 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.423688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.425325 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.425305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.425514 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.425496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.433789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.433769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/778ca226-228b-49c6-af36-d9c52e4ed5e0-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.434138 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.434120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lv6d\" (UniqueName: \"kubernetes.io/projected/778ca226-228b-49c6-af36-d9c52e4ed5e0-kube-api-access-4lv6d\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t\" (UID: \"778ca226-228b-49c6-af36-d9c52e4ed5e0\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.471194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.471160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:53.602633 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.602610 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"]
Apr 22 19:16:53.604758 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:16:53.604729 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod778ca226_228b_49c6_af36_d9c52e4ed5e0.slice/crio-44bff410cbb9465d0f23039f4640e086073e9db26117191b6fcf087d7aaa85f0 WatchSource:0}: Error finding container 44bff410cbb9465d0f23039f4640e086073e9db26117191b6fcf087d7aaa85f0: Status 404 returned error can't find the container with id 44bff410cbb9465d0f23039f4640e086073e9db26117191b6fcf087d7aaa85f0
Apr 22 19:16:53.650092 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:53.650058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t" event={"ID":"778ca226-228b-49c6-af36-d9c52e4ed5e0","Type":"ContainerStarted","Data":"44bff410cbb9465d0f23039f4640e086073e9db26117191b6fcf087d7aaa85f0"}
Apr 22 19:16:56.393011 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:56.392969 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:16:56.393286 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:56.393058 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:16:56.393286 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:56.393098 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:16:56.660738 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:56.660662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t" event={"ID":"778ca226-228b-49c6-af36-d9c52e4ed5e0","Type":"ContainerStarted","Data":"fb99f60000b195746d51f6f5c18e485254b502758787088d2b26463ef6972c2b"}
Apr 22 19:16:56.686474 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:56.686413 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t" podStartSLOduration=0.900224591 podStartE2EDuration="3.686395456s" podCreationTimestamp="2026-04-22 19:16:53 +0000 UTC" firstStartedPulling="2026-04-22 19:16:53.606580878 +0000 UTC m=+638.287058715" lastFinishedPulling="2026-04-22 19:16:56.392751732 +0000 UTC m=+641.073229580" observedRunningTime="2026-04-22 19:16:56.683315381 +0000 UTC m=+641.363793239" watchObservedRunningTime="2026-04-22 19:16:56.686395456 +0000 UTC m=+641.366873312"
Apr 22 19:16:57.471436 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:57.471404 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:57.475894 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:57.475865 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:57.668821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:57.666070 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:16:57.669716 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:16:57.669688 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t"
Apr 22 19:17:25.066739 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.066709 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dv4rw"]
Apr 22 19:17:25.071582 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.071540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:25.075721 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.075699 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 19:17:25.075822 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.075699 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 19:17:25.076813 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.076792 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-v2zgn\""
Apr 22 19:17:25.085957 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.085936 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dv4rw"]
Apr 22 19:17:25.155486 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.155460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wm9\" (UniqueName: \"kubernetes.io/projected/d002fbac-8fae-4ab3-8db2-e36c72d8ae57-kube-api-access-p4wm9\") pod \"kuadrant-operator-catalog-dv4rw\" (UID: \"d002fbac-8fae-4ab3-8db2-e36c72d8ae57\") " pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:25.256152 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.256124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4wm9\" (UniqueName: \"kubernetes.io/projected/d002fbac-8fae-4ab3-8db2-e36c72d8ae57-kube-api-access-p4wm9\") pod \"kuadrant-operator-catalog-dv4rw\" (UID: \"d002fbac-8fae-4ab3-8db2-e36c72d8ae57\") " pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:25.265866 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.265839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4wm9\" (UniqueName: \"kubernetes.io/projected/d002fbac-8fae-4ab3-8db2-e36c72d8ae57-kube-api-access-p4wm9\") pod \"kuadrant-operator-catalog-dv4rw\" (UID: \"d002fbac-8fae-4ab3-8db2-e36c72d8ae57\") " pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:25.380368 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.380296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:25.421474 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.421444 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dv4rw"]
Apr 22 19:17:25.525182 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.525158 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dv4rw"]
Apr 22 19:17:25.527852 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:17:25.527823 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd002fbac_8fae_4ab3_8db2_e36c72d8ae57.slice/crio-a81eb353e6f2cd0737bd92879b446cd559bad786365a162993d1fcc94b054b49 WatchSource:0}: Error finding container a81eb353e6f2cd0737bd92879b446cd559bad786365a162993d1fcc94b054b49: Status 404 returned error can't find the container with id a81eb353e6f2cd0737bd92879b446cd559bad786365a162993d1fcc94b054b49
Apr 22 19:17:25.631807 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.631743 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bwlnk"]
Apr 22 19:17:25.636502 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.636485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:25.644078 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.644059 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bwlnk"]
Apr 22 19:17:25.659768 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.659744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vztd\" (UniqueName: \"kubernetes.io/projected/b8c23efc-a33b-4301-a2cd-a5e76299c388-kube-api-access-2vztd\") pod \"kuadrant-operator-catalog-bwlnk\" (UID: \"b8c23efc-a33b-4301-a2cd-a5e76299c388\") " pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:25.752272 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.752239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw" event={"ID":"d002fbac-8fae-4ab3-8db2-e36c72d8ae57","Type":"ContainerStarted","Data":"a81eb353e6f2cd0737bd92879b446cd559bad786365a162993d1fcc94b054b49"}
Apr 22 19:17:25.760626 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.760606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vztd\" (UniqueName: \"kubernetes.io/projected/b8c23efc-a33b-4301-a2cd-a5e76299c388-kube-api-access-2vztd\") pod \"kuadrant-operator-catalog-bwlnk\" (UID: \"b8c23efc-a33b-4301-a2cd-a5e76299c388\") " pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:25.769092 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.769075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vztd\" (UniqueName: \"kubernetes.io/projected/b8c23efc-a33b-4301-a2cd-a5e76299c388-kube-api-access-2vztd\") pod \"kuadrant-operator-catalog-bwlnk\" (UID: \"b8c23efc-a33b-4301-a2cd-a5e76299c388\") " pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:25.946606 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:25.946519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:26.060604 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:26.060578 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bwlnk"]
Apr 22 19:17:26.062819 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:17:26.062777 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c23efc_a33b_4301_a2cd_a5e76299c388.slice/crio-620f9eb9bff48bc715e48c28f5939c82e7fb5230e8dcd3ae412a338e7fdb3e7c WatchSource:0}: Error finding container 620f9eb9bff48bc715e48c28f5939c82e7fb5230e8dcd3ae412a338e7fdb3e7c: Status 404 returned error can't find the container with id 620f9eb9bff48bc715e48c28f5939c82e7fb5230e8dcd3ae412a338e7fdb3e7c
Apr 22 19:17:26.758326 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:26.758288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk" event={"ID":"b8c23efc-a33b-4301-a2cd-a5e76299c388","Type":"ContainerStarted","Data":"620f9eb9bff48bc715e48c28f5939c82e7fb5230e8dcd3ae412a338e7fdb3e7c"}
Apr 22 19:17:28.766341 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:28.766301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk" event={"ID":"b8c23efc-a33b-4301-a2cd-a5e76299c388","Type":"ContainerStarted","Data":"840ba26534ad6854acc66d00d0daf94f0cc7fadc2752ec19746958ceafc5ddb1"}
Apr 22 19:17:28.767692 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:28.767666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw" event={"ID":"d002fbac-8fae-4ab3-8db2-e36c72d8ae57","Type":"ContainerStarted","Data":"858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77"}
Apr 22 19:17:28.767798 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:28.767720 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw" podUID="d002fbac-8fae-4ab3-8db2-e36c72d8ae57" containerName="registry-server" containerID="cri-o://858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77" gracePeriod=2
Apr 22 19:17:28.784291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:28.784239 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk" podStartSLOduration=2.119081234 podStartE2EDuration="3.784225674s" podCreationTimestamp="2026-04-22 19:17:25 +0000 UTC" firstStartedPulling="2026-04-22 19:17:26.064065971 +0000 UTC m=+670.744543804" lastFinishedPulling="2026-04-22 19:17:27.729210395 +0000 UTC m=+672.409688244" observedRunningTime="2026-04-22 19:17:28.78124529 +0000 UTC m=+673.461723155" watchObservedRunningTime="2026-04-22 19:17:28.784225674 +0000 UTC m=+673.464703529"
Apr 22 19:17:28.800033 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:28.799983 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw" podStartSLOduration=1.602508268 podStartE2EDuration="3.799971927s" podCreationTimestamp="2026-04-22 19:17:25 +0000 UTC" firstStartedPulling="2026-04-22 19:17:25.529116956 +0000 UTC m=+670.209594793" lastFinishedPulling="2026-04-22 19:17:27.726580615 +0000 UTC m=+672.407058452" observedRunningTime="2026-04-22 19:17:28.797667125 +0000 UTC m=+673.478144981" watchObservedRunningTime="2026-04-22 19:17:28.799971927 +0000 UTC m=+673.480449784"
Apr 22 19:17:29.001196 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.001175 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:29.087766 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.087699 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4wm9\" (UniqueName: \"kubernetes.io/projected/d002fbac-8fae-4ab3-8db2-e36c72d8ae57-kube-api-access-p4wm9\") pod \"d002fbac-8fae-4ab3-8db2-e36c72d8ae57\" (UID: \"d002fbac-8fae-4ab3-8db2-e36c72d8ae57\") "
Apr 22 19:17:29.089789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.089764 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002fbac-8fae-4ab3-8db2-e36c72d8ae57-kube-api-access-p4wm9" (OuterVolumeSpecName: "kube-api-access-p4wm9") pod "d002fbac-8fae-4ab3-8db2-e36c72d8ae57" (UID: "d002fbac-8fae-4ab3-8db2-e36c72d8ae57"). InnerVolumeSpecName "kube-api-access-p4wm9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:17:29.188188 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.188161 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4wm9\" (UniqueName: \"kubernetes.io/projected/d002fbac-8fae-4ab3-8db2-e36c72d8ae57-kube-api-access-p4wm9\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:29.772496 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.772452 2576 generic.go:358] "Generic (PLEG): container finished" podID="d002fbac-8fae-4ab3-8db2-e36c72d8ae57" containerID="858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77" exitCode=0
Apr 22 19:17:29.772983 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.772521 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw"
Apr 22 19:17:29.772983 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.772564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw" event={"ID":"d002fbac-8fae-4ab3-8db2-e36c72d8ae57","Type":"ContainerDied","Data":"858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77"}
Apr 22 19:17:29.772983 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.772606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dv4rw" event={"ID":"d002fbac-8fae-4ab3-8db2-e36c72d8ae57","Type":"ContainerDied","Data":"a81eb353e6f2cd0737bd92879b446cd559bad786365a162993d1fcc94b054b49"}
Apr 22 19:17:29.772983 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.772633 2576 scope.go:117] "RemoveContainer" containerID="858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77"
Apr 22 19:17:29.781750 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.781731 2576 scope.go:117] "RemoveContainer" containerID="858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77"
Apr 22 19:17:29.781990 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:17:29.781968 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77\": container with ID starting with 858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77 not found: ID does not exist" containerID="858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77"
Apr 22 19:17:29.782041 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.781999 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77"} err="failed to get container status \"858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77\": rpc error: code = NotFound desc = could not find container \"858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77\": container with ID starting with 858dea1bc49bf7d837f7793679acf1767ca38cb2b558c0ea104dfc02f9ea4b77 not found: ID does not exist"
Apr 22 19:17:29.795421 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.795400 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dv4rw"]
Apr 22 19:17:29.800694 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.800677 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dv4rw"]
Apr 22 19:17:29.883201 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:29.883178 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d002fbac-8fae-4ab3-8db2-e36c72d8ae57" path="/var/lib/kubelet/pods/d002fbac-8fae-4ab3-8db2-e36c72d8ae57/volumes"
Apr 22 19:17:35.947008 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:35.946975 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:35.947008 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:35.947007 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:35.967782 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:35.967759 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:36.816339 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:36.816310 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-bwlnk"
Apr 22 19:17:40.613961 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.613927 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"]
Apr 22 19:17:40.614343 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.614223 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d002fbac-8fae-4ab3-8db2-e36c72d8ae57" containerName="registry-server"
Apr 22 19:17:40.614343 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.614236 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d002fbac-8fae-4ab3-8db2-e36c72d8ae57" containerName="registry-server"
Apr 22 19:17:40.614343 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.614304 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d002fbac-8fae-4ab3-8db2-e36c72d8ae57" containerName="registry-server"
Apr 22 19:17:40.617220 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.617204 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.619598 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.619579 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6vvpl\""
Apr 22 19:17:40.628479 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.628458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"]
Apr 22 19:17:40.674219 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.674189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdvl\" (UniqueName: \"kubernetes.io/projected/dcea311f-afff-4b14-874c-c48c8cfda339-kube-api-access-5rdvl\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.674337 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.674223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.674337 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.674245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.774676 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.774650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdvl\" (UniqueName: \"kubernetes.io/projected/dcea311f-afff-4b14-874c-c48c8cfda339-kube-api-access-5rdvl\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.774676 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.774679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.774833 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.774701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.775032 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.775017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.775074 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.775043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.784211 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.784182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdvl\" (UniqueName: \"kubernetes.io/projected/dcea311f-afff-4b14-874c-c48c8cfda339-kube-api-access-5rdvl\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:40.925986 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:40.925918 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w" Apr 22 19:17:41.042759 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.042732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"] Apr 22 19:17:41.045117 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:17:41.045080 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcea311f_afff_4b14_874c_c48c8cfda339.slice/crio-5f1463f79e0193199f316047615fb7b3958029dcc358549411c82dc83ef6f618 WatchSource:0}: Error finding container 5f1463f79e0193199f316047615fb7b3958029dcc358549411c82dc83ef6f618: Status 404 returned error can't find the container with id 5f1463f79e0193199f316047615fb7b3958029dcc358549411c82dc83ef6f618 Apr 22 19:17:41.417525 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.417497 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl"] Apr 22 19:17:41.420727 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.420711 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.428892 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.428868 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl"] Apr 22 19:17:41.481196 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.481166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64kqr\" (UniqueName: \"kubernetes.io/projected/36838a31-9166-433f-a56b-d65afe7fccc0-kube-api-access-64kqr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.481328 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.481209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.481328 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.481273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.582118 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.582088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.582240 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.582141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64kqr\" (UniqueName: \"kubernetes.io/projected/36838a31-9166-433f-a56b-d65afe7fccc0-kube-api-access-64kqr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.582240 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.582167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.582460 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.582442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.582494 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.582467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-util\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.590663 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.590641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64kqr\" (UniqueName: \"kubernetes.io/projected/36838a31-9166-433f-a56b-d65afe7fccc0-kube-api-access-64kqr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.730064 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.729988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" Apr 22 19:17:41.813952 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.813921 2576 generic.go:358] "Generic (PLEG): container finished" podID="dcea311f-afff-4b14-874c-c48c8cfda339" containerID="c4aa92e95fcc45c3c83f5e250f2428364ac22bc741b1acb55e272887e1d65e5c" exitCode=0 Apr 22 19:17:41.814064 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.813999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w" event={"ID":"dcea311f-afff-4b14-874c-c48c8cfda339","Type":"ContainerDied","Data":"c4aa92e95fcc45c3c83f5e250f2428364ac22bc741b1acb55e272887e1d65e5c"} Apr 22 19:17:41.814064 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:41.814029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w" event={"ID":"dcea311f-afff-4b14-874c-c48c8cfda339","Type":"ContainerStarted","Data":"5f1463f79e0193199f316047615fb7b3958029dcc358549411c82dc83ef6f618"} Apr 22 19:17:41.849166 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:17:41.849140 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl"] Apr 22 19:17:41.850795 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:17:41.850767 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36838a31_9166_433f_a56b_d65afe7fccc0.slice/crio-8113993717aebc3377bfc82aaa25c5621ec52ad3f5d9752e2e516a1d643b966f WatchSource:0}: Error finding container 8113993717aebc3377bfc82aaa25c5621ec52ad3f5d9752e2e516a1d643b966f: Status 404 returned error can't find the container with id 8113993717aebc3377bfc82aaa25c5621ec52ad3f5d9752e2e516a1d643b966f Apr 22 19:17:42.095723 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.095693 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c"] Apr 22 19:17:42.098833 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.098818 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.107831 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.107808 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c"] Apr 22 19:17:42.186733 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.186708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxzz\" (UniqueName: \"kubernetes.io/projected/367c1472-8cda-4c0c-9f69-b79028800176-kube-api-access-dpxzz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.186839 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.186742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.186839 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.186806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.287368 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.287335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dpxzz\" (UniqueName: \"kubernetes.io/projected/367c1472-8cda-4c0c-9f69-b79028800176-kube-api-access-dpxzz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.287496 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.287377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.287496 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.287442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.287804 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.287784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.287873 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.287819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.296434 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.296403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxzz\" (UniqueName: \"kubernetes.io/projected/367c1472-8cda-4c0c-9f69-b79028800176-kube-api-access-dpxzz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.408156 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.408085 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" Apr 22 19:17:42.429402 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.429372 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g"] Apr 22 19:17:42.432925 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.432901 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.442456 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.442422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g"] Apr 22 19:17:42.489691 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.489538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.489691 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.489666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.489932 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.489700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcgfr\" (UniqueName: \"kubernetes.io/projected/021e828f-b756-4c70-91a0-53be754f90be-kube-api-access-qcgfr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.540052 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.536818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c"] Apr 22 19:17:42.590482 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.590460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.590593 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.590576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.590650 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.590612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcgfr\" (UniqueName: \"kubernetes.io/projected/021e828f-b756-4c70-91a0-53be754f90be-kube-api-access-qcgfr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.590856 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.590837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 
19:17:42.590914 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.590895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.599789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.599764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcgfr\" (UniqueName: \"kubernetes.io/projected/021e828f-b756-4c70-91a0-53be754f90be-kube-api-access-qcgfr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.746802 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.746774 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" Apr 22 19:17:42.823375 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.823338 2576 generic.go:358] "Generic (PLEG): container finished" podID="36838a31-9166-433f-a56b-d65afe7fccc0" containerID="925d8c55233657312d98be6f552403633c2e173323367d4fc429707d727724bb" exitCode=0 Apr 22 19:17:42.823495 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.823462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" event={"ID":"36838a31-9166-433f-a56b-d65afe7fccc0","Type":"ContainerDied","Data":"925d8c55233657312d98be6f552403633c2e173323367d4fc429707d727724bb"} Apr 22 19:17:42.823574 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.823492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" event={"ID":"36838a31-9166-433f-a56b-d65afe7fccc0","Type":"ContainerStarted","Data":"8113993717aebc3377bfc82aaa25c5621ec52ad3f5d9752e2e516a1d643b966f"} Apr 22 19:17:42.827087 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.827063 2576 generic.go:358] "Generic (PLEG): container finished" podID="dcea311f-afff-4b14-874c-c48c8cfda339" containerID="9282bc4f281698aa2ec8f479f15c02fe90e949ca2d13f4ad3272de258b588fae" exitCode=0 Apr 22 19:17:42.827185 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.827137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w" event={"ID":"dcea311f-afff-4b14-874c-c48c8cfda339","Type":"ContainerDied","Data":"9282bc4f281698aa2ec8f479f15c02fe90e949ca2d13f4ad3272de258b588fae"} Apr 22 19:17:42.828784 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.828760 2576 generic.go:358] "Generic (PLEG): container finished" podID="367c1472-8cda-4c0c-9f69-b79028800176" 
containerID="f372d50461c4a21f8145ff973d65fb6b7f8d2d22710750c512e45aa55587487f" exitCode=0 Apr 22 19:17:42.828888 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.828844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" event={"ID":"367c1472-8cda-4c0c-9f69-b79028800176","Type":"ContainerDied","Data":"f372d50461c4a21f8145ff973d65fb6b7f8d2d22710750c512e45aa55587487f"} Apr 22 19:17:42.828888 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.828879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" event={"ID":"367c1472-8cda-4c0c-9f69-b79028800176","Type":"ContainerStarted","Data":"247dbe30cc20aaef6e274e9ee098123e72ff40ca9a3cb9e2c80303eaa09da2f5"} Apr 22 19:17:42.876699 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:42.876679 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g"] Apr 22 19:17:42.878869 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:17:42.878841 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod021e828f_b756_4c70_91a0_53be754f90be.slice/crio-162c57bc0ef0f43fae28b3482b6a55ecab4032897eb5cc21b363423cafb927eb WatchSource:0}: Error finding container 162c57bc0ef0f43fae28b3482b6a55ecab4032897eb5cc21b363423cafb927eb: Status 404 returned error can't find the container with id 162c57bc0ef0f43fae28b3482b6a55ecab4032897eb5cc21b363423cafb927eb Apr 22 19:17:43.834628 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.834595 2576 generic.go:358] "Generic (PLEG): container finished" podID="021e828f-b756-4c70-91a0-53be754f90be" containerID="6e7fb8c68a755c7f74d11ec587ba6297077d2d29ff76dc8d5cb4d0a35fd0935c" exitCode=0 Apr 22 19:17:43.834975 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.834673 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" event={"ID":"021e828f-b756-4c70-91a0-53be754f90be","Type":"ContainerDied","Data":"6e7fb8c68a755c7f74d11ec587ba6297077d2d29ff76dc8d5cb4d0a35fd0935c"} Apr 22 19:17:43.834975 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.834705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" event={"ID":"021e828f-b756-4c70-91a0-53be754f90be","Type":"ContainerStarted","Data":"162c57bc0ef0f43fae28b3482b6a55ecab4032897eb5cc21b363423cafb927eb"} Apr 22 19:17:43.836377 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.836352 2576 generic.go:358] "Generic (PLEG): container finished" podID="36838a31-9166-433f-a56b-d65afe7fccc0" containerID="ffe60e534f9718e0e6e1581f57ef25732404d77a81c43a031c998d21901fdcaa" exitCode=0 Apr 22 19:17:43.836482 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.836431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" event={"ID":"36838a31-9166-433f-a56b-d65afe7fccc0","Type":"ContainerDied","Data":"ffe60e534f9718e0e6e1581f57ef25732404d77a81c43a031c998d21901fdcaa"} Apr 22 19:17:43.839039 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.838987 2576 generic.go:358] "Generic (PLEG): container finished" podID="dcea311f-afff-4b14-874c-c48c8cfda339" containerID="7cebc8f046888a28b7f735a62225ed2c0311f27e1197f2a46e82fd860a7d4ed6" exitCode=0 Apr 22 19:17:43.839120 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.839045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w" event={"ID":"dcea311f-afff-4b14-874c-c48c8cfda339","Type":"ContainerDied","Data":"7cebc8f046888a28b7f735a62225ed2c0311f27e1197f2a46e82fd860a7d4ed6"} Apr 22 19:17:43.840657 ip-10-0-141-191 kubenswrapper[2576]: 
I0422 19:17:43.840635 2576 generic.go:358] "Generic (PLEG): container finished" podID="367c1472-8cda-4c0c-9f69-b79028800176" containerID="36747b76a9b308c0529627cc9a9499f4eb0cf9b262265fb66d5f86ce86096acb" exitCode=0
Apr 22 19:17:43.840741 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:43.840702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" event={"ID":"367c1472-8cda-4c0c-9f69-b79028800176","Type":"ContainerDied","Data":"36747b76a9b308c0529627cc9a9499f4eb0cf9b262265fb66d5f86ce86096acb"}
Apr 22 19:17:44.851088 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.851052 2576 generic.go:358] "Generic (PLEG): container finished" podID="367c1472-8cda-4c0c-9f69-b79028800176" containerID="cd2ce628268da5deb2ebc024bb6bcb5b7ac94093616976f02be6a1883096f5a2" exitCode=0
Apr 22 19:17:44.851456 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.851097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" event={"ID":"367c1472-8cda-4c0c-9f69-b79028800176","Type":"ContainerDied","Data":"cd2ce628268da5deb2ebc024bb6bcb5b7ac94093616976f02be6a1883096f5a2"}
Apr 22 19:17:44.852755 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.852728 2576 generic.go:358] "Generic (PLEG): container finished" podID="021e828f-b756-4c70-91a0-53be754f90be" containerID="1cee054817c6aefb96e1abf5f06313ba25bac51212c2cb9717ee9fcc54932e39" exitCode=0
Apr 22 19:17:44.852875 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.852815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" event={"ID":"021e828f-b756-4c70-91a0-53be754f90be","Type":"ContainerDied","Data":"1cee054817c6aefb96e1abf5f06313ba25bac51212c2cb9717ee9fcc54932e39"}
Apr 22 19:17:44.854794 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.854767 2576 generic.go:358] "Generic (PLEG): container finished" podID="36838a31-9166-433f-a56b-d65afe7fccc0" containerID="1ef86b4adc20619ce726c0e5bd7394ee0133c772e567429dc1211370c4594bd9" exitCode=0
Apr 22 19:17:44.855004 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.854850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" event={"ID":"36838a31-9166-433f-a56b-d65afe7fccc0","Type":"ContainerDied","Data":"1ef86b4adc20619ce726c0e5bd7394ee0133c772e567429dc1211370c4594bd9"}
Apr 22 19:17:44.992455 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:44.992432 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:45.108969 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.108905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-util\") pod \"dcea311f-afff-4b14-874c-c48c8cfda339\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") "
Apr 22 19:17:45.108969 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.108953 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-bundle\") pod \"dcea311f-afff-4b14-874c-c48c8cfda339\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") "
Apr 22 19:17:45.109207 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.108983 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rdvl\" (UniqueName: \"kubernetes.io/projected/dcea311f-afff-4b14-874c-c48c8cfda339-kube-api-access-5rdvl\") pod \"dcea311f-afff-4b14-874c-c48c8cfda339\" (UID: \"dcea311f-afff-4b14-874c-c48c8cfda339\") "
Apr 22 19:17:45.109393 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.109368 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-bundle" (OuterVolumeSpecName: "bundle") pod "dcea311f-afff-4b14-874c-c48c8cfda339" (UID: "dcea311f-afff-4b14-874c-c48c8cfda339"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:45.110966 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.110941 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcea311f-afff-4b14-874c-c48c8cfda339-kube-api-access-5rdvl" (OuterVolumeSpecName: "kube-api-access-5rdvl") pod "dcea311f-afff-4b14-874c-c48c8cfda339" (UID: "dcea311f-afff-4b14-874c-c48c8cfda339"). InnerVolumeSpecName "kube-api-access-5rdvl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:17:45.114267 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.114231 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-util" (OuterVolumeSpecName: "util") pod "dcea311f-afff-4b14-874c-c48c8cfda339" (UID: "dcea311f-afff-4b14-874c-c48c8cfda339"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:45.210055 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.210027 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:45.210055 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.210051 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcea311f-afff-4b14-874c-c48c8cfda339-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:45.210194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.210065 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rdvl\" (UniqueName: \"kubernetes.io/projected/dcea311f-afff-4b14-874c-c48c8cfda339-kube-api-access-5rdvl\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:45.859340 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.859309 2576 generic.go:358] "Generic (PLEG): container finished" podID="021e828f-b756-4c70-91a0-53be754f90be" containerID="bc6bac143a569c56ad771c2f8c9b67f934b10d85a1bfa403b8b5e738f2e6f29b" exitCode=0
Apr 22 19:17:45.859725 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.859395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" event={"ID":"021e828f-b756-4c70-91a0-53be754f90be","Type":"ContainerDied","Data":"bc6bac143a569c56ad771c2f8c9b67f934b10d85a1bfa403b8b5e738f2e6f29b"}
Apr 22 19:17:45.860975 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.860961 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w"
Apr 22 19:17:45.861065 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.860988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w" event={"ID":"dcea311f-afff-4b14-874c-c48c8cfda339","Type":"ContainerDied","Data":"5f1463f79e0193199f316047615fb7b3958029dcc358549411c82dc83ef6f618"}
Apr 22 19:17:45.861065 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.861014 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1463f79e0193199f316047615fb7b3958029dcc358549411c82dc83ef6f618"
Apr 22 19:17:45.983785 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:45.983765 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c"
Apr 22 19:17:46.010775 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.010754 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl"
Apr 22 19:17:46.117175 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117114 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-bundle\") pod \"36838a31-9166-433f-a56b-d65afe7fccc0\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") "
Apr 22 19:17:46.117175 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117144 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-bundle\") pod \"367c1472-8cda-4c0c-9f69-b79028800176\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") "
Apr 22 19:17:46.117175 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117164 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpxzz\" (UniqueName: \"kubernetes.io/projected/367c1472-8cda-4c0c-9f69-b79028800176-kube-api-access-dpxzz\") pod \"367c1472-8cda-4c0c-9f69-b79028800176\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") "
Apr 22 19:17:46.117387 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117218 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64kqr\" (UniqueName: \"kubernetes.io/projected/36838a31-9166-433f-a56b-d65afe7fccc0-kube-api-access-64kqr\") pod \"36838a31-9166-433f-a56b-d65afe7fccc0\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") "
Apr 22 19:17:46.117387 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117236 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-util\") pod \"367c1472-8cda-4c0c-9f69-b79028800176\" (UID: \"367c1472-8cda-4c0c-9f69-b79028800176\") "
Apr 22 19:17:46.117387 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117255 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-util\") pod \"36838a31-9166-433f-a56b-d65afe7fccc0\" (UID: \"36838a31-9166-433f-a56b-d65afe7fccc0\") "
Apr 22 19:17:46.117830 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117801 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-bundle" (OuterVolumeSpecName: "bundle") pod "367c1472-8cda-4c0c-9f69-b79028800176" (UID: "367c1472-8cda-4c0c-9f69-b79028800176"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:46.118012 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.117983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-bundle" (OuterVolumeSpecName: "bundle") pod "36838a31-9166-433f-a56b-d65afe7fccc0" (UID: "36838a31-9166-433f-a56b-d65afe7fccc0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:46.119447 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.119424 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367c1472-8cda-4c0c-9f69-b79028800176-kube-api-access-dpxzz" (OuterVolumeSpecName: "kube-api-access-dpxzz") pod "367c1472-8cda-4c0c-9f69-b79028800176" (UID: "367c1472-8cda-4c0c-9f69-b79028800176"). InnerVolumeSpecName "kube-api-access-dpxzz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:17:46.119678 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.119653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36838a31-9166-433f-a56b-d65afe7fccc0-kube-api-access-64kqr" (OuterVolumeSpecName: "kube-api-access-64kqr") pod "36838a31-9166-433f-a56b-d65afe7fccc0" (UID: "36838a31-9166-433f-a56b-d65afe7fccc0"). InnerVolumeSpecName "kube-api-access-64kqr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:17:46.122873 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.122853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-util" (OuterVolumeSpecName: "util") pod "367c1472-8cda-4c0c-9f69-b79028800176" (UID: "367c1472-8cda-4c0c-9f69-b79028800176"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:46.122939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.122909 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-util" (OuterVolumeSpecName: "util") pod "36838a31-9166-433f-a56b-d65afe7fccc0" (UID: "36838a31-9166-433f-a56b-d65afe7fccc0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:46.218049 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.218022 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64kqr\" (UniqueName: \"kubernetes.io/projected/36838a31-9166-433f-a56b-d65afe7fccc0-kube-api-access-64kqr\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:46.218049 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.218045 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:46.218178 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.218055 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:46.218178 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.218064 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36838a31-9166-433f-a56b-d65afe7fccc0-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:46.218178 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.218071 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367c1472-8cda-4c0c-9f69-b79028800176-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:46.218178 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.218079 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpxzz\" (UniqueName: \"kubernetes.io/projected/367c1472-8cda-4c0c-9f69-b79028800176-kube-api-access-dpxzz\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:46.866026 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.865999 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl"
Apr 22 19:17:46.866396 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.865998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl" event={"ID":"36838a31-9166-433f-a56b-d65afe7fccc0","Type":"ContainerDied","Data":"8113993717aebc3377bfc82aaa25c5621ec52ad3f5d9752e2e516a1d643b966f"}
Apr 22 19:17:46.866396 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.866099 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8113993717aebc3377bfc82aaa25c5621ec52ad3f5d9752e2e516a1d643b966f"
Apr 22 19:17:46.867797 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.867780 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c"
Apr 22 19:17:46.867891 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.867833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c" event={"ID":"367c1472-8cda-4c0c-9f69-b79028800176","Type":"ContainerDied","Data":"247dbe30cc20aaef6e274e9ee098123e72ff40ca9a3cb9e2c80303eaa09da2f5"}
Apr 22 19:17:46.867891 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.867857 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="247dbe30cc20aaef6e274e9ee098123e72ff40ca9a3cb9e2c80303eaa09da2f5"
Apr 22 19:17:46.985732 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:46.985704 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g"
Apr 22 19:17:47.125745 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.125679 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcgfr\" (UniqueName: \"kubernetes.io/projected/021e828f-b756-4c70-91a0-53be754f90be-kube-api-access-qcgfr\") pod \"021e828f-b756-4c70-91a0-53be754f90be\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") "
Apr 22 19:17:47.125861 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.125743 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-util\") pod \"021e828f-b756-4c70-91a0-53be754f90be\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") "
Apr 22 19:17:47.125861 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.125800 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-bundle\") pod \"021e828f-b756-4c70-91a0-53be754f90be\" (UID: \"021e828f-b756-4c70-91a0-53be754f90be\") "
Apr 22 19:17:47.126242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.126206 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-bundle" (OuterVolumeSpecName: "bundle") pod "021e828f-b756-4c70-91a0-53be754f90be" (UID: "021e828f-b756-4c70-91a0-53be754f90be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:47.127690 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.127666 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021e828f-b756-4c70-91a0-53be754f90be-kube-api-access-qcgfr" (OuterVolumeSpecName: "kube-api-access-qcgfr") pod "021e828f-b756-4c70-91a0-53be754f90be" (UID: "021e828f-b756-4c70-91a0-53be754f90be"). InnerVolumeSpecName "kube-api-access-qcgfr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:17:47.133176 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.133150 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-util" (OuterVolumeSpecName: "util") pod "021e828f-b756-4c70-91a0-53be754f90be" (UID: "021e828f-b756-4c70-91a0-53be754f90be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:17:47.227027 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.226997 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-util\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:47.227027 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.227020 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/021e828f-b756-4c70-91a0-53be754f90be-bundle\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:47.227156 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.227031 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcgfr\" (UniqueName: \"kubernetes.io/projected/021e828f-b756-4c70-91a0-53be754f90be-kube-api-access-qcgfr\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:17:47.873050 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.873007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g" event={"ID":"021e828f-b756-4c70-91a0-53be754f90be","Type":"ContainerDied","Data":"162c57bc0ef0f43fae28b3482b6a55ecab4032897eb5cc21b363423cafb927eb"}
Apr 22 19:17:47.873050 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.873045 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="162c57bc0ef0f43fae28b3482b6a55ecab4032897eb5cc21b363423cafb927eb"
Apr 22 19:17:47.873434 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:47.873057 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g"
Apr 22 19:17:57.300381 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300349 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"]
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300611 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300624 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300631 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300637 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300644 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300650 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300659 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300664 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300670 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300676 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300692 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300698 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300704 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="util"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300709 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300714 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300720 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300725 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300730 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300734 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300740 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300745 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="pull"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300751 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300756 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300799 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcea311f-afff-4b14-874c-c48c8cfda339" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300806 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="36838a31-9166-433f-a56b-d65afe7fccc0" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300813 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="021e828f-b756-4c70-91a0-53be754f90be" containerName="extract"
Apr 22 19:17:57.300847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.300820 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="367c1472-8cda-4c0c-9f69-b79028800176" containerName="extract"
Apr 22 19:17:57.304941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.304925 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"
Apr 22 19:17:57.308439 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.308419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-wsn9b\""
Apr 22 19:17:57.322050 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.322027 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"]
Apr 22 19:17:57.407145 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.407118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vjxt\" (UniqueName: \"kubernetes.io/projected/45f2f3f3-1169-47f6-95de-c7da41d80224-kube-api-access-7vjxt\") pod \"limitador-operator-controller-manager-85c4996f8c-5vf8m\" (UID: \"45f2f3f3-1169-47f6-95de-c7da41d80224\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"
Apr 22 19:17:57.507403 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.507367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vjxt\" (UniqueName: \"kubernetes.io/projected/45f2f3f3-1169-47f6-95de-c7da41d80224-kube-api-access-7vjxt\") pod \"limitador-operator-controller-manager-85c4996f8c-5vf8m\" (UID: \"45f2f3f3-1169-47f6-95de-c7da41d80224\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"
Apr 22 19:17:57.516664 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.516635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vjxt\" (UniqueName: \"kubernetes.io/projected/45f2f3f3-1169-47f6-95de-c7da41d80224-kube-api-access-7vjxt\") pod \"limitador-operator-controller-manager-85c4996f8c-5vf8m\" (UID: \"45f2f3f3-1169-47f6-95de-c7da41d80224\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"
Apr 22 19:17:57.614415 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.614355 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"
Apr 22 19:17:57.730761 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.730729 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"]
Apr 22 19:17:57.733785 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:17:57.733750 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f2f3f3_1169_47f6_95de_c7da41d80224.slice/crio-1edc129d873b955079b34b6ae9fb91cb9bfb337c1a8b0f1fca9770f9fbf3364d WatchSource:0}: Error finding container 1edc129d873b955079b34b6ae9fb91cb9bfb337c1a8b0f1fca9770f9fbf3364d: Status 404 returned error can't find the container with id 1edc129d873b955079b34b6ae9fb91cb9bfb337c1a8b0f1fca9770f9fbf3364d
Apr 22 19:17:57.905502 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:17:57.905432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" event={"ID":"45f2f3f3-1169-47f6-95de-c7da41d80224","Type":"ContainerStarted","Data":"1edc129d873b955079b34b6ae9fb91cb9bfb337c1a8b0f1fca9770f9fbf3364d"}
Apr 22 19:18:00.918189 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:00.918153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" event={"ID":"45f2f3f3-1169-47f6-95de-c7da41d80224","Type":"ContainerStarted","Data":"5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238"}
Apr 22 19:18:00.918587 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:00.918271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"
Apr 22 19:18:00.948358 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:00.948313 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" podStartSLOduration=1.404459264 podStartE2EDuration="3.948300841s" podCreationTimestamp="2026-04-22 19:17:57 +0000 UTC" firstStartedPulling="2026-04-22 19:17:57.738096673 +0000 UTC m=+702.418574511" lastFinishedPulling="2026-04-22 19:18:00.281938255 +0000 UTC m=+704.962416088" observedRunningTime="2026-04-22 19:18:00.9452455 +0000 UTC m=+705.625723355" watchObservedRunningTime="2026-04-22 19:18:00.948300841 +0000 UTC m=+705.628778696"
Apr 22 19:18:05.224355 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.224274 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"]
Apr 22 19:18:05.227387 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.227372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"
Apr 22 19:18:05.230850 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.230828 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-wfj8d\""
Apr 22 19:18:05.231249 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.231234 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 19:18:05.243525 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.243495 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"]
Apr 22 19:18:05.369054 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.369028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpktp\" (UniqueName: \"kubernetes.io/projected/64b14dcf-2441-4902-b01e-c729cb4558b5-kube-api-access-xpktp\") pod \"dns-operator-controller-manager-648d5c98bc-77t92\" (UID: \"64b14dcf-2441-4902-b01e-c729cb4558b5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"
Apr 22 19:18:05.470176 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.470145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpktp\" (UniqueName: \"kubernetes.io/projected/64b14dcf-2441-4902-b01e-c729cb4558b5-kube-api-access-xpktp\") pod \"dns-operator-controller-manager-648d5c98bc-77t92\" (UID: \"64b14dcf-2441-4902-b01e-c729cb4558b5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"
Apr 22 19:18:05.484900 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.484838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpktp\" (UniqueName: \"kubernetes.io/projected/64b14dcf-2441-4902-b01e-c729cb4558b5-kube-api-access-xpktp\") pod \"dns-operator-controller-manager-648d5c98bc-77t92\" (UID: \"64b14dcf-2441-4902-b01e-c729cb4558b5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"
Apr 22 19:18:05.537811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.537781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"
Apr 22 19:18:05.671942 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:18:05.671909 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b14dcf_2441_4902_b01e_c729cb4558b5.slice/crio-947448e0063781b7d47c93bb829897799a86bebf14b28ee9508a663da00c7a26 WatchSource:0}: Error finding container 947448e0063781b7d47c93bb829897799a86bebf14b28ee9508a663da00c7a26: Status 404 returned error can't find the container with id 947448e0063781b7d47c93bb829897799a86bebf14b28ee9508a663da00c7a26
Apr 22 19:18:05.678582 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.674390 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"]
Apr 22 19:18:05.936145 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:05.936114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92" event={"ID":"64b14dcf-2441-4902-b01e-c729cb4558b5","Type":"ContainerStarted","Data":"947448e0063781b7d47c93bb829897799a86bebf14b28ee9508a663da00c7a26"}
Apr 22 19:18:08.946672 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:08.946586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92" event={"ID":"64b14dcf-2441-4902-b01e-c729cb4558b5","Type":"ContainerStarted","Data":"48bb030d3a9c40644801f4f64d9237892d9c3d5078a43c51c4bb3b5031039b47"}
Apr 22 19:18:08.947083 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:08.946687 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92"
Apr 22 19:18:08.967827 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:08.967783 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92" podStartSLOduration=0.990000989 podStartE2EDuration="3.967770199s" podCreationTimestamp="2026-04-22 19:18:05 +0000 UTC" firstStartedPulling="2026-04-22 19:18:05.674766403 +0000 UTC m=+710.355244254" lastFinishedPulling="2026-04-22 19:18:08.652535627 +0000 UTC m=+713.333013464" observedRunningTime="2026-04-22 19:18:08.966852989 +0000 UTC m=+713.647330846" watchObservedRunningTime="2026-04-22 19:18:08.967770199 +0000 UTC m=+713.648248053"
Apr 22 19:18:10.019439 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.019409 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"]
Apr 22 19:18:10.022628 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.022606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"
Apr 22 19:18:10.025393 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.025368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-2f9z4\""
Apr 22 19:18:10.036765 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.036737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"]
Apr 22 19:18:10.104061 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.104023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/51024a3a-0693-4712-8cb5-8834a421b1cd-kube-api-access-lr896\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"
Apr 22 19:18:10.104231 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.104110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51024a3a-0693-4712-8cb5-8834a421b1cd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"
Apr 22 19:18:10.205501 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.205466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/51024a3a-0693-4712-8cb5-8834a421b1cd-kube-api-access-lr896\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") "
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:10.205691 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.205517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51024a3a-0693-4712-8cb5-8834a421b1cd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:10.205899 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.205882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51024a3a-0693-4712-8cb5-8834a421b1cd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:10.229421 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.229385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/51024a3a-0693-4712-8cb5-8834a421b1cd-kube-api-access-lr896\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:10.331881 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.331784 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:10.470452 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.470421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"] Apr 22 19:18:10.473434 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:18:10.473405 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51024a3a_0693_4712_8cb5_8834a421b1cd.slice/crio-87a26e672ea0018bd382d90bb07f00d63c625fd1db87290ed7394c44578f48af WatchSource:0}: Error finding container 87a26e672ea0018bd382d90bb07f00d63c625fd1db87290ed7394c44578f48af: Status 404 returned error can't find the container with id 87a26e672ea0018bd382d90bb07f00d63c625fd1db87290ed7394c44578f48af Apr 22 19:18:10.954806 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:10.954766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" event={"ID":"51024a3a-0693-4712-8cb5-8834a421b1cd","Type":"ContainerStarted","Data":"87a26e672ea0018bd382d90bb07f00d63c625fd1db87290ed7394c44578f48af"} Apr 22 19:18:11.923931 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:11.923905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" Apr 22 19:18:16.983439 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:16.983395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" event={"ID":"51024a3a-0693-4712-8cb5-8834a421b1cd","Type":"ContainerStarted","Data":"358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731"} Apr 22 19:18:16.983798 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:16.983526 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:17.018864 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:17.018818 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" podStartSLOduration=0.653800099 podStartE2EDuration="7.018804986s" podCreationTimestamp="2026-04-22 19:18:10 +0000 UTC" firstStartedPulling="2026-04-22 19:18:10.476381591 +0000 UTC m=+715.156859429" lastFinishedPulling="2026-04-22 19:18:16.841386483 +0000 UTC m=+721.521864316" observedRunningTime="2026-04-22 19:18:17.017407951 +0000 UTC m=+721.697885807" watchObservedRunningTime="2026-04-22 19:18:17.018804986 +0000 UTC m=+721.699282841" Apr 22 19:18:19.951654 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:19.951626 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-77t92" Apr 22 19:18:27.989371 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:27.989339 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:29.610023 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.609990 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb"] Apr 22 19:18:29.613419 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.613399 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" Apr 22 19:18:29.634998 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.634971 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb"] Apr 22 19:18:29.635165 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:18:29.635148 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-th4m6], unattached volumes=[], failed to process volumes=[extensions-socket-volume kube-api-access-th4m6]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" podUID="378d9b95-090d-417c-ad60-a8e211b026da" Apr 22 19:18:29.643872 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.643854 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb"] Apr 22 19:18:29.650310 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.650275 2576 status_manager.go:919] "Failed to update status for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378d9b95-090d-417c-ad60-a8e211b026da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-22T19:18:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-22T19:18:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-22T19:18:29Z\\\",\\\"message\\\":\\\"containers with 
unready status: [manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-22T19:18:29Z\\\",\\\"message\\\":\\\"containers with unready status: [manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/kuadrant/kuadrant-operator:v1.4.2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/kuadrant\\\",\\\"name\\\":\\\"extensions-socket-volume\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-th4m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"10.0.141.191\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"10.0.141.191\\\"}],\\\"startTime\\\":\\\"2026-04-22T19:18:29Z\\\"}}\" for pod \"kuadrant-system\"/\"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\": pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" not found" Apr 22 19:18:29.658579 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.658539 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"] Apr 22 19:18:29.658802 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.658764 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" containerName="manager" containerID="cri-o://358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731" gracePeriod=2 Apr 22 19:18:29.660867 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.660843 2576 
status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:29.671752 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.671730 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2"] Apr 22 19:18:29.676954 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.676934 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk"] Apr 22 19:18:29.677203 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.677192 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" containerName="manager" Apr 22 19:18:29.677242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.677205 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" containerName="manager" Apr 22 19:18:29.677275 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.677266 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" containerName="manager" Apr 22 19:18:29.679996 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.679980 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.682628 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.682600 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:29.685307 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.685280 2576 status_manager.go:895] "Failed to get status for pod" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:29.706328 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.706236 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"] Apr 22 19:18:29.709423 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.709403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk"] Apr 22 19:18:29.709523 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.709511 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:29.710790 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.710775 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"] Apr 22 19:18:29.710977 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.710959 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" podUID="45f2f3f3-1169-47f6-95de-c7da41d80224" containerName="manager" containerID="cri-o://5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238" gracePeriod=2 Apr 22 19:18:29.718644 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.718621 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m"] Apr 22 19:18:29.723861 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.723838 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"] Apr 22 19:18:29.732576 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.732533 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:29.735626 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.735604 2576 status_manager.go:895] "Failed to get status for pod" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" err="pods 
\"kuadrant-operator-controller-manager-5f895dd7d5-5snt2\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:29.748844 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.748820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2nwh\" (UniqueName: \"kubernetes.io/projected/5c2d426d-3561-4560-8644-9417aff439ba-kube-api-access-g2nwh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-m5lrk\" (UID: \"5c2d426d-3561-4560-8644-9417aff439ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.748951 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.748892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c2d426d-3561-4560-8644-9417aff439ba-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-m5lrk\" (UID: \"5c2d426d-3561-4560-8644-9417aff439ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.797159 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.797130 2576 status_manager.go:895] "Failed to get status for pod" podUID="45f2f3f3-1169-47f6-95de-c7da41d80224" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" err="pods \"limitador-operator-controller-manager-85c4996f8c-5vf8m\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:29.851771 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.851528 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c2d426d-3561-4560-8644-9417aff439ba-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-m5lrk\" (UID: \"5c2d426d-3561-4560-8644-9417aff439ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.851771 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.851607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm95x\" (UniqueName: \"kubernetes.io/projected/06e482a1-3926-4ec3-8cae-5b0148204486-kube-api-access-bm95x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gz7zz\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:29.851771 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.851691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2nwh\" (UniqueName: \"kubernetes.io/projected/5c2d426d-3561-4560-8644-9417aff439ba-kube-api-access-g2nwh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-m5lrk\" (UID: \"5c2d426d-3561-4560-8644-9417aff439ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.851771 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.851721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06e482a1-3926-4ec3-8cae-5b0148204486-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gz7zz\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:29.852156 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.852129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c2d426d-3561-4560-8644-9417aff439ba-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-m5lrk\" (UID: \"5c2d426d-3561-4560-8644-9417aff439ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.870445 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.870391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2nwh\" (UniqueName: \"kubernetes.io/projected/5c2d426d-3561-4560-8644-9417aff439ba-kube-api-access-g2nwh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-m5lrk\" (UID: \"5c2d426d-3561-4560-8644-9417aff439ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" Apr 22 19:18:29.887205 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.887175 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378d9b95-090d-417c-ad60-a8e211b026da" path="/var/lib/kubelet/pods/378d9b95-090d-417c-ad60-a8e211b026da/volumes" Apr 22 19:18:29.930784 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.930761 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:29.933717 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.933693 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" Apr 22 19:18:29.952648 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.952629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vjxt\" (UniqueName: \"kubernetes.io/projected/45f2f3f3-1169-47f6-95de-c7da41d80224-kube-api-access-7vjxt\") pod \"45f2f3f3-1169-47f6-95de-c7da41d80224\" (UID: \"45f2f3f3-1169-47f6-95de-c7da41d80224\") " Apr 22 19:18:29.952727 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.952661 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/51024a3a-0693-4712-8cb5-8834a421b1cd-kube-api-access-lr896\") pod \"51024a3a-0693-4712-8cb5-8834a421b1cd\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " Apr 22 19:18:29.952727 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.952709 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51024a3a-0693-4712-8cb5-8834a421b1cd-extensions-socket-volume\") pod \"51024a3a-0693-4712-8cb5-8834a421b1cd\" (UID: \"51024a3a-0693-4712-8cb5-8834a421b1cd\") " Apr 22 19:18:29.952840 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.952797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06e482a1-3926-4ec3-8cae-5b0148204486-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gz7zz\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:29.952901 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.952856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm95x\" (UniqueName: 
\"kubernetes.io/projected/06e482a1-3926-4ec3-8cae-5b0148204486-kube-api-access-bm95x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gz7zz\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:29.953287 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.953229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06e482a1-3926-4ec3-8cae-5b0148204486-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gz7zz\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:29.953379 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.953307 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51024a3a-0693-4712-8cb5-8834a421b1cd-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "51024a3a-0693-4712-8cb5-8834a421b1cd" (UID: "51024a3a-0693-4712-8cb5-8834a421b1cd"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:18:29.954620 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.954598 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51024a3a-0693-4712-8cb5-8834a421b1cd-kube-api-access-lr896" (OuterVolumeSpecName: "kube-api-access-lr896") pod "51024a3a-0693-4712-8cb5-8834a421b1cd" (UID: "51024a3a-0693-4712-8cb5-8834a421b1cd"). InnerVolumeSpecName "kube-api-access-lr896". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:18:29.954681 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.954645 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f2f3f3-1169-47f6-95de-c7da41d80224-kube-api-access-7vjxt" (OuterVolumeSpecName: "kube-api-access-7vjxt") pod "45f2f3f3-1169-47f6-95de-c7da41d80224" (UID: "45f2f3f3-1169-47f6-95de-c7da41d80224"). InnerVolumeSpecName "kube-api-access-7vjxt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:18:29.971953 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:29.971932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm95x\" (UniqueName: \"kubernetes.io/projected/06e482a1-3926-4ec3-8cae-5b0148204486-kube-api-access-bm95x\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gz7zz\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" Apr 22 19:18:30.030320 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.030299 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f2f3f3-1169-47f6-95de-c7da41d80224" containerID="5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238" exitCode=0 Apr 22 19:18:30.030395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.030339 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5vf8m" Apr 22 19:18:30.030395 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.030377 2576 scope.go:117] "RemoveContainer" containerID="5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238" Apr 22 19:18:30.031791 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.031768 2576 generic.go:358] "Generic (PLEG): container finished" podID="51024a3a-0693-4712-8cb5-8834a421b1cd" containerID="358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731" exitCode=0 Apr 22 19:18:30.031879 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.031806 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5snt2" Apr 22 19:18:30.031991 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.031978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" Apr 22 19:18:30.034571 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.034521 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object" Apr 22 19:18:30.036645 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.036627 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb"
Apr 22 19:18:30.038617 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.038590 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object"
Apr 22 19:18:30.039378 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.039338 2576 scope.go:117] "RemoveContainer" containerID="5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238"
Apr 22 19:18:30.039648 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:18:30.039626 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238\": container with ID starting with 5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238 not found: ID does not exist" containerID="5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238"
Apr 22 19:18:30.039735 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.039656 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238"} err="failed to get container status \"5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238\": rpc error: code = NotFound desc = could not find container \"5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238\": container with ID starting with 5551422ecb25f832a7dcf916084062a2d0009b6980f50a60a2647ed8052ac238 not found: ID does not exist"
Apr 22 19:18:30.039735 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.039678 2576 scope.go:117] "RemoveContainer" containerID="358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731"
Apr 22 19:18:30.042463 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.042442 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object"
Apr 22 19:18:30.044636 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.044615 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object"
Apr 22 19:18:30.046940 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.046926 2576 scope.go:117] "RemoveContainer" containerID="358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731"
Apr 22 19:18:30.047169 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:18:30.047149 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731\": container with ID starting with 358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731 not found: ID does not exist" containerID="358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731"
Apr 22 19:18:30.047225 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.047179 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731"} err="failed to get container status \"358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731\": rpc error: code = NotFound desc = could not find container \"358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731\": container with ID starting with 358c4bcca8e18ca99d0fa82fa0215ceabccc41a5477a4a515f4f4a32fdef8731 not found: ID does not exist"
Apr 22 19:18:30.053638 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.053619 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/51024a3a-0693-4712-8cb5-8834a421b1cd-extensions-socket-volume\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:18:30.053719 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.053641 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vjxt\" (UniqueName: \"kubernetes.io/projected/45f2f3f3-1169-47f6-95de-c7da41d80224-kube-api-access-7vjxt\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:18:30.053719 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.053655 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lr896\" (UniqueName: \"kubernetes.io/projected/51024a3a-0693-4712-8cb5-8834a421b1cd-kube-api-access-lr896\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:18:30.075069 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.075053 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk"
Apr 22 19:18:30.081626 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.081609 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"
Apr 22 19:18:30.213520 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.213497 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk"]
Apr 22 19:18:30.214136 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:18:30.214109 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2d426d_3561_4560_8644_9417aff439ba.slice/crio-7d2cafa2f1feeec43441708372b3182fca35f8b604cb2d5141ebd4adeecc007c WatchSource:0}: Error finding container 7d2cafa2f1feeec43441708372b3182fca35f8b604cb2d5141ebd4adeecc007c: Status 404 returned error can't find the container with id 7d2cafa2f1feeec43441708372b3182fca35f8b604cb2d5141ebd4adeecc007c
Apr 22 19:18:30.239787 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:30.239763 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"]
Apr 22 19:18:30.254668 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:18:30.254636 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e482a1_3926_4ec3_8cae_5b0148204486.slice/crio-2a33e9a399fcdb8e1fbd1553184be580c775b0f5e5d329b57c8016129197c473 WatchSource:0}: Error finding container 2a33e9a399fcdb8e1fbd1553184be580c775b0f5e5d329b57c8016129197c473: Status 404 returned error can't find the container with id 2a33e9a399fcdb8e1fbd1553184be580c775b0f5e5d329b57c8016129197c473
Apr 22 19:18:31.037003 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.036966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" event={"ID":"06e482a1-3926-4ec3-8cae-5b0148204486","Type":"ContainerStarted","Data":"d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b"}
Apr 22 19:18:31.037003 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.037002 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" event={"ID":"06e482a1-3926-4ec3-8cae-5b0148204486","Type":"ContainerStarted","Data":"2a33e9a399fcdb8e1fbd1553184be580c775b0f5e5d329b57c8016129197c473"}
Apr 22 19:18:31.037492 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.037118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"
Apr 22 19:18:31.038451 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.038420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" event={"ID":"5c2d426d-3561-4560-8644-9417aff439ba","Type":"ContainerStarted","Data":"acbe2b6e51a0b4e16401c281db9df1053d6aeeff68efe57ebe739527b903707a"}
Apr 22 19:18:31.038451 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.038448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" event={"ID":"5c2d426d-3561-4560-8644-9417aff439ba","Type":"ContainerStarted","Data":"7d2cafa2f1feeec43441708372b3182fca35f8b604cb2d5141ebd4adeecc007c"}
Apr 22 19:18:31.038661 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.038564 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk"
Apr 22 19:18:31.040080 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.040062 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb"
Apr 22 19:18:31.065932 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.065891 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object"
Apr 22 19:18:31.067294 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.067255 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" podStartSLOduration=2.067242398 podStartE2EDuration="2.067242398s" podCreationTimestamp="2026-04-22 19:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:18:31.063613945 +0000 UTC m=+735.744091800" watchObservedRunningTime="2026-04-22 19:18:31.067242398 +0000 UTC m=+735.747720247"
Apr 22 19:18:31.068869 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.068845 2576 status_manager.go:895] "Failed to get status for pod" podUID="378d9b95-090d-417c-ad60-a8e211b026da" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zvlpb" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-zvlpb\" is forbidden: User \"system:node:ip-10-0-141-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-191.ec2.internal' and this object"
Apr 22 19:18:31.105368 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.105269 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk" podStartSLOduration=2.105258645 podStartE2EDuration="2.105258645s" podCreationTimestamp="2026-04-22 19:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:18:31.104563079 +0000 UTC m=+735.785041006" watchObservedRunningTime="2026-04-22 19:18:31.105258645 +0000 UTC m=+735.785736499"
Apr 22 19:18:31.884333 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.884299 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f2f3f3-1169-47f6-95de-c7da41d80224" path="/var/lib/kubelet/pods/45f2f3f3-1169-47f6-95de-c7da41d80224/volumes"
Apr 22 19:18:31.884669 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:31.884655 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51024a3a-0693-4712-8cb5-8834a421b1cd" path="/var/lib/kubelet/pods/51024a3a-0693-4712-8cb5-8834a421b1cd/volumes"
Apr 22 19:18:42.046256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.046228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"
Apr 22 19:18:42.046625 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.046288 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-m5lrk"
Apr 22 19:18:42.109233 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.109202 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"]
Apr 22 19:18:42.109462 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.109438 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" podUID="06e482a1-3926-4ec3-8cae-5b0148204486" containerName="manager" containerID="cri-o://d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b" gracePeriod=10
Apr 22 19:18:42.333756 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.333733 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"
Apr 22 19:18:42.449598 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.449570 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm95x\" (UniqueName: \"kubernetes.io/projected/06e482a1-3926-4ec3-8cae-5b0148204486-kube-api-access-bm95x\") pod \"06e482a1-3926-4ec3-8cae-5b0148204486\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") "
Apr 22 19:18:42.449759 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.449628 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06e482a1-3926-4ec3-8cae-5b0148204486-extensions-socket-volume\") pod \"06e482a1-3926-4ec3-8cae-5b0148204486\" (UID: \"06e482a1-3926-4ec3-8cae-5b0148204486\") "
Apr 22 19:18:42.450025 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.450001 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e482a1-3926-4ec3-8cae-5b0148204486-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "06e482a1-3926-4ec3-8cae-5b0148204486" (UID: "06e482a1-3926-4ec3-8cae-5b0148204486"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:18:42.451359 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.451336 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e482a1-3926-4ec3-8cae-5b0148204486-kube-api-access-bm95x" (OuterVolumeSpecName: "kube-api-access-bm95x") pod "06e482a1-3926-4ec3-8cae-5b0148204486" (UID: "06e482a1-3926-4ec3-8cae-5b0148204486"). InnerVolumeSpecName "kube-api-access-bm95x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:18:42.550941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.550916 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06e482a1-3926-4ec3-8cae-5b0148204486-extensions-socket-volume\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:18:42.550941 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:42.550936 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bm95x\" (UniqueName: \"kubernetes.io/projected/06e482a1-3926-4ec3-8cae-5b0148204486-kube-api-access-bm95x\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:18:43.082355 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.082324 2576 generic.go:358] "Generic (PLEG): container finished" podID="06e482a1-3926-4ec3-8cae-5b0148204486" containerID="d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b" exitCode=0
Apr 22 19:18:43.082783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.082408 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"
Apr 22 19:18:43.082783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.082408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" event={"ID":"06e482a1-3926-4ec3-8cae-5b0148204486","Type":"ContainerDied","Data":"d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b"}
Apr 22 19:18:43.082783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.082447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz" event={"ID":"06e482a1-3926-4ec3-8cae-5b0148204486","Type":"ContainerDied","Data":"2a33e9a399fcdb8e1fbd1553184be580c775b0f5e5d329b57c8016129197c473"}
Apr 22 19:18:43.082783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.082462 2576 scope.go:117] "RemoveContainer" containerID="d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b"
Apr 22 19:18:43.092653 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.092638 2576 scope.go:117] "RemoveContainer" containerID="d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b"
Apr 22 19:18:43.092888 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:18:43.092873 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b\": container with ID starting with d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b not found: ID does not exist" containerID="d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b"
Apr 22 19:18:43.092932 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.092895 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b"} err="failed to get container status \"d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b\": rpc error: code = NotFound desc = could not find container \"d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b\": container with ID starting with d11a778c3d44dfc04839f182d65613edfd74a04623f4a3578513b54cea47bb6b not found: ID does not exist"
Apr 22 19:18:43.107355 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.107332 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"]
Apr 22 19:18:43.114110 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.114085 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gz7zz"]
Apr 22 19:18:43.884003 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:43.883972 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e482a1-3926-4ec3-8cae-5b0148204486" path="/var/lib/kubelet/pods/06e482a1-3926-4ec3-8cae-5b0148204486/volumes"
Apr 22 19:18:58.379138 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379104 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"]
Apr 22 19:18:58.379509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379368 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45f2f3f3-1169-47f6-95de-c7da41d80224" containerName="manager"
Apr 22 19:18:58.379509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379378 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f2f3f3-1169-47f6-95de-c7da41d80224" containerName="manager"
Apr 22 19:18:58.379509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379398 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06e482a1-3926-4ec3-8cae-5b0148204486" containerName="manager"
Apr 22 19:18:58.379509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379404 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e482a1-3926-4ec3-8cae-5b0148204486" containerName="manager"
Apr 22 19:18:58.379509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379448 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="45f2f3f3-1169-47f6-95de-c7da41d80224" containerName="manager"
Apr 22 19:18:58.379509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.379457 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="06e482a1-3926-4ec3-8cae-5b0148204486" containerName="manager"
Apr 22 19:18:58.383756 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.383737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.386694 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.386672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-k7tmd\""
Apr 22 19:18:58.403208 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.403181 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"]
Apr 22 19:18:58.463723 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5qw\" (UniqueName: \"kubernetes.io/projected/4c504779-1d51-402a-846b-cd36eb6c5927-kube-api-access-dk5qw\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463811 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463883 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463883 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463883 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c504779-1d51-402a-846b-cd36eb6c5927-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463968 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c504779-1d51-402a-846b-cd36eb6c5927-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.463968 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.464032 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.463978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4c504779-1d51-402a-846b-cd36eb6c5927-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564342 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5qw\" (UniqueName: \"kubernetes.io/projected/4c504779-1d51-402a-846b-cd36eb6c5927-kube-api-access-dk5qw\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564464 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564464 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564464 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564464 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564687 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c504779-1d51-402a-846b-cd36eb6c5927-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564687 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c504779-1d51-402a-846b-cd36eb6c5927-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564687 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564849 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4c504779-1d51-402a-846b-cd36eb6c5927-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564849 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.564951 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.565011 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.564947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.565182 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.565161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.565361 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.565341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4c504779-1d51-402a-846b-cd36eb6c5927-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.566878 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.566852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4c504779-1d51-402a-846b-cd36eb6c5927-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.566950 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.566936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c504779-1d51-402a-846b-cd36eb6c5927-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.573829 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.573805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c504779-1d51-402a-846b-cd36eb6c5927-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.574015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.573994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5qw\" (UniqueName: \"kubernetes.io/projected/4c504779-1d51-402a-846b-cd36eb6c5927-kube-api-access-dk5qw\") pod \"maas-default-gateway-openshift-default-58b6f876-4hzds\" (UID: \"4c504779-1d51-402a-846b-cd36eb6c5927\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.694603 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.694523 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:58.822087 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.822060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"]
Apr 22 19:18:58.823855 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:18:58.823825 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c504779_1d51_402a_846b_cd36eb6c5927.slice/crio-dd8092a9fe7ef9d0eb44270f33f37e9ca92340377b256a44ee51dda90e8beff1 WatchSource:0}: Error finding container dd8092a9fe7ef9d0eb44270f33f37e9ca92340377b256a44ee51dda90e8beff1: Status 404 returned error can't find the container with id dd8092a9fe7ef9d0eb44270f33f37e9ca92340377b256a44ee51dda90e8beff1
Apr 22 19:18:58.825743 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.825702 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:18:58.825841 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.825788 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:18:58.825841 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:58.825829 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 22 19:18:59.137390 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:59.137355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds" event={"ID":"4c504779-1d51-402a-846b-cd36eb6c5927","Type":"ContainerStarted","Data":"bc515d3c0096ce5b66bcd00916380899c12e7eb939da00b6d8fe562ef65e773c"}
Apr 22 19:18:59.137390 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:59.137390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds" event={"ID":"4c504779-1d51-402a-846b-cd36eb6c5927","Type":"ContainerStarted","Data":"dd8092a9fe7ef9d0eb44270f33f37e9ca92340377b256a44ee51dda90e8beff1"}
Apr 22 19:18:59.162167 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:59.162123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds" podStartSLOduration=1.162109425 podStartE2EDuration="1.162109425s" podCreationTimestamp="2026-04-22 19:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:18:59.160885593 +0000 UTC m=+763.841363448" watchObservedRunningTime="2026-04-22 19:18:59.162109425 +0000 UTC m=+763.842587279"
Apr 22 19:18:59.694703 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:59.694671 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:18:59.699315 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:18:59.699290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:19:00.141426 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:00.141396 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:19:00.142305 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:00.142287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-4hzds"
Apr 22 19:19:02.952820 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:02.952745 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"]
Apr 22 19:19:02.956070 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:02.956050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb"
Apr 22 19:19:02.958530 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:02.958507 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6vvpl\""
Apr 22 19:19:02.958696 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:02.958509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 19:19:02.969040 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:02.969013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"]
Apr 22 19:19:03.003954 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.003925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtn7h\" (UniqueName: \"kubernetes.io/projected/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-kube-api-access-vtn7h\") pod \"limitador-limitador-7d549b5b-979nb\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " pod="kuadrant-system/limitador-limitador-7d549b5b-979nb"
Apr 22 19:19:03.004109 ip-10-0-141-191
kubenswrapper[2576]: I0422 19:19:03.003965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-config-file\") pod \"limitador-limitador-7d549b5b-979nb\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:03.047793 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.047761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"] Apr 22 19:19:03.104621 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.104594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtn7h\" (UniqueName: \"kubernetes.io/projected/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-kube-api-access-vtn7h\") pod \"limitador-limitador-7d549b5b-979nb\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:03.104787 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.104631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-config-file\") pod \"limitador-limitador-7d549b5b-979nb\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:03.105291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.105266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-config-file\") pod \"limitador-limitador-7d549b5b-979nb\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:03.113705 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.113682 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vtn7h\" (UniqueName: \"kubernetes.io/projected/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-kube-api-access-vtn7h\") pod \"limitador-limitador-7d549b5b-979nb\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:03.265859 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.265824 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:03.387763 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.387739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"] Apr 22 19:19:03.389728 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:03.389693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d6d90b6_9a30_4957_9c5d_3e88c1a51338.slice/crio-a064b8b156608bd18efb3d4c450f09c7988d760f4c3b8f59e910c8a0b89a6db9 WatchSource:0}: Error finding container a064b8b156608bd18efb3d4c450f09c7988d760f4c3b8f59e910c8a0b89a6db9: Status 404 returned error can't find the container with id a064b8b156608bd18efb3d4c450f09c7988d760f4c3b8f59e910c8a0b89a6db9 Apr 22 19:19:03.775995 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.775969 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jl9jc"] Apr 22 19:19:03.780348 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.780327 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:03.782877 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.782854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wwxkk\"" Apr 22 19:19:03.787358 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.787337 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jl9jc"] Apr 22 19:19:03.911388 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.911358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxxz\" (UniqueName: \"kubernetes.io/projected/054c9ce7-32d6-4e34-a43a-75993b41334a-kube-api-access-fkxxz\") pod \"authorino-f99f4b5cd-jl9jc\" (UID: \"054c9ce7-32d6-4e34-a43a-75993b41334a\") " pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:03.914469 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.914444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-lfnd2"] Apr 22 19:19:03.917718 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.917696 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lfnd2" Apr 22 19:19:03.922388 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:03.922365 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-lfnd2"] Apr 22 19:19:04.012829 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.012792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxxz\" (UniqueName: \"kubernetes.io/projected/054c9ce7-32d6-4e34-a43a-75993b41334a-kube-api-access-fkxxz\") pod \"authorino-f99f4b5cd-jl9jc\" (UID: \"054c9ce7-32d6-4e34-a43a-75993b41334a\") " pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:04.013290 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.012879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfmg\" (UniqueName: \"kubernetes.io/projected/20f00377-a013-4ce4-839a-68b91ffb01c2-kube-api-access-7qfmg\") pod \"authorino-7498df8756-lfnd2\" (UID: \"20f00377-a013-4ce4-839a-68b91ffb01c2\") " pod="kuadrant-system/authorino-7498df8756-lfnd2" Apr 22 19:19:04.022603 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.022543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxxz\" (UniqueName: \"kubernetes.io/projected/054c9ce7-32d6-4e34-a43a-75993b41334a-kube-api-access-fkxxz\") pod \"authorino-f99f4b5cd-jl9jc\" (UID: \"054c9ce7-32d6-4e34-a43a-75993b41334a\") " pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:04.091079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.090630 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:04.115918 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.115884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfmg\" (UniqueName: \"kubernetes.io/projected/20f00377-a013-4ce4-839a-68b91ffb01c2-kube-api-access-7qfmg\") pod \"authorino-7498df8756-lfnd2\" (UID: \"20f00377-a013-4ce4-839a-68b91ffb01c2\") " pod="kuadrant-system/authorino-7498df8756-lfnd2" Apr 22 19:19:04.126371 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.126340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfmg\" (UniqueName: \"kubernetes.io/projected/20f00377-a013-4ce4-839a-68b91ffb01c2-kube-api-access-7qfmg\") pod \"authorino-7498df8756-lfnd2\" (UID: \"20f00377-a013-4ce4-839a-68b91ffb01c2\") " pod="kuadrant-system/authorino-7498df8756-lfnd2" Apr 22 19:19:04.160225 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.160173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" event={"ID":"4d6d90b6-9a30-4957-9c5d-3e88c1a51338","Type":"ContainerStarted","Data":"a064b8b156608bd18efb3d4c450f09c7988d760f4c3b8f59e910c8a0b89a6db9"} Apr 22 19:19:04.228007 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.227969 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lfnd2" Apr 22 19:19:04.263883 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.263833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jl9jc"] Apr 22 19:19:04.268196 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:04.268162 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod054c9ce7_32d6_4e34_a43a_75993b41334a.slice/crio-7bcfc6ac915a4635b1c071ac9f36c9af5018a021ed5ff125a6785c436fe4acb5 WatchSource:0}: Error finding container 7bcfc6ac915a4635b1c071ac9f36c9af5018a021ed5ff125a6785c436fe4acb5: Status 404 returned error can't find the container with id 7bcfc6ac915a4635b1c071ac9f36c9af5018a021ed5ff125a6785c436fe4acb5 Apr 22 19:19:04.401230 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:04.401198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-lfnd2"] Apr 22 19:19:04.402844 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:04.402799 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20f00377_a013_4ce4_839a_68b91ffb01c2.slice/crio-73898b714741e32bc2b9d1e1341e249efa117ce20d422d7914b29015becce160 WatchSource:0}: Error finding container 73898b714741e32bc2b9d1e1341e249efa117ce20d422d7914b29015becce160: Status 404 returned error can't find the container with id 73898b714741e32bc2b9d1e1341e249efa117ce20d422d7914b29015becce160 Apr 22 19:19:05.166056 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:05.166018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lfnd2" event={"ID":"20f00377-a013-4ce4-839a-68b91ffb01c2","Type":"ContainerStarted","Data":"73898b714741e32bc2b9d1e1341e249efa117ce20d422d7914b29015becce160"} Apr 22 19:19:05.167268 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:05.167237 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" event={"ID":"054c9ce7-32d6-4e34-a43a-75993b41334a","Type":"ContainerStarted","Data":"7bcfc6ac915a4635b1c071ac9f36c9af5018a021ed5ff125a6785c436fe4acb5"} Apr 22 19:19:07.181010 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:07.180948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" event={"ID":"4d6d90b6-9a30-4957-9c5d-3e88c1a51338","Type":"ContainerStarted","Data":"fee79f9e1d23a0b9b17e80483f9220e1460c33b661ea39fc6e8a7a091b3e7282"} Apr 22 19:19:07.181642 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:07.181620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:07.204453 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:07.204397 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" podStartSLOduration=2.379107473 podStartE2EDuration="5.20438069s" podCreationTimestamp="2026-04-22 19:19:02 +0000 UTC" firstStartedPulling="2026-04-22 19:19:03.391583237 +0000 UTC m=+768.072061075" lastFinishedPulling="2026-04-22 19:19:06.216856457 +0000 UTC m=+770.897334292" observedRunningTime="2026-04-22 19:19:07.20267508 +0000 UTC m=+771.883152962" watchObservedRunningTime="2026-04-22 19:19:07.20438069 +0000 UTC m=+771.884858545" Apr 22 19:19:09.189822 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:09.189789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lfnd2" event={"ID":"20f00377-a013-4ce4-839a-68b91ffb01c2","Type":"ContainerStarted","Data":"b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69"} Apr 22 19:19:09.191137 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:09.191108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" 
event={"ID":"054c9ce7-32d6-4e34-a43a-75993b41334a","Type":"ContainerStarted","Data":"e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb"} Apr 22 19:19:09.208393 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:09.208345 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-lfnd2" podStartSLOduration=2.050870592 podStartE2EDuration="6.208331888s" podCreationTimestamp="2026-04-22 19:19:03 +0000 UTC" firstStartedPulling="2026-04-22 19:19:04.40511138 +0000 UTC m=+769.085589227" lastFinishedPulling="2026-04-22 19:19:08.562572685 +0000 UTC m=+773.243050523" observedRunningTime="2026-04-22 19:19:09.206449233 +0000 UTC m=+773.886927089" watchObservedRunningTime="2026-04-22 19:19:09.208331888 +0000 UTC m=+773.888809742" Apr 22 19:19:09.226630 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:09.226586 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" podStartSLOduration=1.922922926 podStartE2EDuration="6.226569942s" podCreationTimestamp="2026-04-22 19:19:03 +0000 UTC" firstStartedPulling="2026-04-22 19:19:04.270165161 +0000 UTC m=+768.950642998" lastFinishedPulling="2026-04-22 19:19:08.573812182 +0000 UTC m=+773.254290014" observedRunningTime="2026-04-22 19:19:09.224321849 +0000 UTC m=+773.904799704" watchObservedRunningTime="2026-04-22 19:19:09.226569942 +0000 UTC m=+773.907047799" Apr 22 19:19:09.240485 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:09.240456 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jl9jc"] Apr 22 19:19:11.197898 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:11.197803 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" podUID="054c9ce7-32d6-4e34-a43a-75993b41334a" containerName="authorino" containerID="cri-o://e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb" 
gracePeriod=30 Apr 22 19:19:11.436065 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:11.436038 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:11.590933 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:11.590900 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkxxz\" (UniqueName: \"kubernetes.io/projected/054c9ce7-32d6-4e34-a43a-75993b41334a-kube-api-access-fkxxz\") pod \"054c9ce7-32d6-4e34-a43a-75993b41334a\" (UID: \"054c9ce7-32d6-4e34-a43a-75993b41334a\") " Apr 22 19:19:11.592911 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:11.592880 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054c9ce7-32d6-4e34-a43a-75993b41334a-kube-api-access-fkxxz" (OuterVolumeSpecName: "kube-api-access-fkxxz") pod "054c9ce7-32d6-4e34-a43a-75993b41334a" (UID: "054c9ce7-32d6-4e34-a43a-75993b41334a"). InnerVolumeSpecName "kube-api-access-fkxxz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:11.692357 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:11.692321 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkxxz\" (UniqueName: \"kubernetes.io/projected/054c9ce7-32d6-4e34-a43a-75993b41334a-kube-api-access-fkxxz\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:19:12.202812 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.202725 2576 generic.go:358] "Generic (PLEG): container finished" podID="054c9ce7-32d6-4e34-a43a-75993b41334a" containerID="e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb" exitCode=0 Apr 22 19:19:12.202812 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.202781 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" Apr 22 19:19:12.202812 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.202799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" event={"ID":"054c9ce7-32d6-4e34-a43a-75993b41334a","Type":"ContainerDied","Data":"e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb"} Apr 22 19:19:12.203280 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.202824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jl9jc" event={"ID":"054c9ce7-32d6-4e34-a43a-75993b41334a","Type":"ContainerDied","Data":"7bcfc6ac915a4635b1c071ac9f36c9af5018a021ed5ff125a6785c436fe4acb5"} Apr 22 19:19:12.203280 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.202838 2576 scope.go:117] "RemoveContainer" containerID="e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb" Apr 22 19:19:12.210734 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.210703 2576 scope.go:117] "RemoveContainer" containerID="e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb" Apr 22 19:19:12.210976 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:12.210956 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb\": container with ID starting with e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb not found: ID does not exist" containerID="e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb" Apr 22 19:19:12.211032 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.210982 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb"} err="failed to get container status \"e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb\": rpc error: code = 
NotFound desc = could not find container \"e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb\": container with ID starting with e59926b1663a2c04ba596a24d49bfca16ac9713d34dd67d51b31edc8f92839fb not found: ID does not exist" Apr 22 19:19:12.224083 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.224060 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jl9jc"] Apr 22 19:19:12.229335 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:12.229315 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jl9jc"] Apr 22 19:19:13.884497 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:13.884464 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054c9ce7-32d6-4e34-a43a-75993b41334a" path="/var/lib/kubelet/pods/054c9ce7-32d6-4e34-a43a-75993b41334a/volumes" Apr 22 19:19:17.768293 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:17.768259 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"] Apr 22 19:19:17.768692 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:17.768484 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" podUID="4d6d90b6-9a30-4957-9c5d-3e88c1a51338" containerName="limitador" containerID="cri-o://fee79f9e1d23a0b9b17e80483f9220e1460c33b661ea39fc6e8a7a091b3e7282" gracePeriod=30 Apr 22 19:19:17.769131 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:17.769109 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:18.227031 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.227001 2576 generic.go:358] "Generic (PLEG): container finished" podID="4d6d90b6-9a30-4957-9c5d-3e88c1a51338" containerID="fee79f9e1d23a0b9b17e80483f9220e1460c33b661ea39fc6e8a7a091b3e7282" exitCode=0 Apr 22 19:19:18.227188 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:19:18.227057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" event={"ID":"4d6d90b6-9a30-4957-9c5d-3e88c1a51338","Type":"ContainerDied","Data":"fee79f9e1d23a0b9b17e80483f9220e1460c33b661ea39fc6e8a7a091b3e7282"} Apr 22 19:19:18.704778 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.704753 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:18.846140 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.846078 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtn7h\" (UniqueName: \"kubernetes.io/projected/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-kube-api-access-vtn7h\") pod \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " Apr 22 19:19:18.846468 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.846144 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-config-file\") pod \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\" (UID: \"4d6d90b6-9a30-4957-9c5d-3e88c1a51338\") " Apr 22 19:19:18.846507 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.846468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-config-file" (OuterVolumeSpecName: "config-file") pod "4d6d90b6-9a30-4957-9c5d-3e88c1a51338" (UID: "4d6d90b6-9a30-4957-9c5d-3e88c1a51338"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:19:18.848026 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.848004 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-kube-api-access-vtn7h" (OuterVolumeSpecName: "kube-api-access-vtn7h") pod "4d6d90b6-9a30-4957-9c5d-3e88c1a51338" (UID: "4d6d90b6-9a30-4957-9c5d-3e88c1a51338"). InnerVolumeSpecName "kube-api-access-vtn7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:18.947396 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.947371 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtn7h\" (UniqueName: \"kubernetes.io/projected/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-kube-api-access-vtn7h\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:19:18.947396 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:18.947394 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4d6d90b6-9a30-4957-9c5d-3e88c1a51338-config-file\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:19:19.231681 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:19.231607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" event={"ID":"4d6d90b6-9a30-4957-9c5d-3e88c1a51338","Type":"ContainerDied","Data":"a064b8b156608bd18efb3d4c450f09c7988d760f4c3b8f59e910c8a0b89a6db9"} Apr 22 19:19:19.231681 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:19.231645 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-979nb" Apr 22 19:19:19.231850 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:19.231650 2576 scope.go:117] "RemoveContainer" containerID="fee79f9e1d23a0b9b17e80483f9220e1460c33b661ea39fc6e8a7a091b3e7282" Apr 22 19:19:19.254543 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:19.254518 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"] Apr 22 19:19:19.261809 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:19.261788 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-979nb"] Apr 22 19:19:19.883543 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:19.883514 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6d90b6-9a30-4957-9c5d-3e88c1a51338" path="/var/lib/kubelet/pods/4d6d90b6-9a30-4957-9c5d-3e88c1a51338/volumes" Apr 22 19:19:31.287068 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287034 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-j7k86"] Apr 22 19:19:31.287455 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287439 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="054c9ce7-32d6-4e34-a43a-75993b41334a" containerName="authorino" Apr 22 19:19:31.287492 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287457 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="054c9ce7-32d6-4e34-a43a-75993b41334a" containerName="authorino" Apr 22 19:19:31.287492 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287484 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d6d90b6-9a30-4957-9c5d-3e88c1a51338" containerName="limitador" Apr 22 19:19:31.287492 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287489 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6d90b6-9a30-4957-9c5d-3e88c1a51338" containerName="limitador" Apr 
22 19:19:31.287605 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287535 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d6d90b6-9a30-4957-9c5d-3e88c1a51338" containerName="limitador"
Apr 22 19:19:31.287605 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.287558 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="054c9ce7-32d6-4e34-a43a-75993b41334a" containerName="authorino"
Apr 22 19:19:31.291658 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.291641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:31.304207 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.304182 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-j7k86"]
Apr 22 19:19:31.337523 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.337495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lskj6\" (UniqueName: \"kubernetes.io/projected/ba32232c-87a1-47d3-a63a-431694766a0d-kube-api-access-lskj6\") pod \"authorino-8b475cf9f-j7k86\" (UID: \"ba32232c-87a1-47d3-a63a-431694766a0d\") " pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:31.438066 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.438036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lskj6\" (UniqueName: \"kubernetes.io/projected/ba32232c-87a1-47d3-a63a-431694766a0d-kube-api-access-lskj6\") pod \"authorino-8b475cf9f-j7k86\" (UID: \"ba32232c-87a1-47d3-a63a-431694766a0d\") " pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:31.448868 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.448841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lskj6\" (UniqueName: \"kubernetes.io/projected/ba32232c-87a1-47d3-a63a-431694766a0d-kube-api-access-lskj6\") pod \"authorino-8b475cf9f-j7k86\" (UID: \"ba32232c-87a1-47d3-a63a-431694766a0d\") " pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:31.549451 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.549395 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-j7k86"]
Apr 22 19:19:31.549628 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.549613 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:31.653794 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.653751 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-675f7964d8-bvrfv"]
Apr 22 19:19:31.661195 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.661167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:31.711164 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.711143 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-j7k86"]
Apr 22 19:19:31.712678 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:31.712653 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba32232c_87a1_47d3_a63a_431694766a0d.slice/crio-2f023ee1b058af0f2c60a6f7a3afacc6fb65ca23e332077c0cd7a0776e3dc649 WatchSource:0}: Error finding container 2f023ee1b058af0f2c60a6f7a3afacc6fb65ca23e332077c0cd7a0776e3dc649: Status 404 returned error can't find the container with id 2f023ee1b058af0f2c60a6f7a3afacc6fb65ca23e332077c0cd7a0776e3dc649
Apr 22 19:19:31.740125 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.740101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwlr\" (UniqueName: \"kubernetes.io/projected/52cb2651-45ad-4a05-9134-239af4efa2f1-kube-api-access-xxwlr\") pod \"authorino-675f7964d8-bvrfv\" (UID: \"52cb2651-45ad-4a05-9134-239af4efa2f1\") " pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:31.743362 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.743339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-675f7964d8-bvrfv"]
Apr 22 19:19:31.841112 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.841060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwlr\" (UniqueName: \"kubernetes.io/projected/52cb2651-45ad-4a05-9134-239af4efa2f1-kube-api-access-xxwlr\") pod \"authorino-675f7964d8-bvrfv\" (UID: \"52cb2651-45ad-4a05-9134-239af4efa2f1\") " pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:31.871603 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.871583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwlr\" (UniqueName: \"kubernetes.io/projected/52cb2651-45ad-4a05-9134-239af4efa2f1-kube-api-access-xxwlr\") pod \"authorino-675f7964d8-bvrfv\" (UID: \"52cb2651-45ad-4a05-9134-239af4efa2f1\") " pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:31.924637 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.924614 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-675f7964d8-bvrfv"]
Apr 22 19:19:31.924806 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.924795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:31.961802 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.961274 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5549cfc44-kq958"]
Apr 22 19:19:31.966217 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.966198 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:31.970709 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.970465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 19:19:31.978116 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:31.978089 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5549cfc44-kq958"]
Apr 22 19:19:32.042156 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.042128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcg2n\" (UniqueName: \"kubernetes.io/projected/8f040d06-9c31-401f-83d2-90969b7f25d9-kube-api-access-rcg2n\") pod \"authorino-5549cfc44-kq958\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") " pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.042274 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.042173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f040d06-9c31-401f-83d2-90969b7f25d9-tls-cert\") pod \"authorino-5549cfc44-kq958\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") " pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.053209 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.053186 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-675f7964d8-bvrfv"]
Apr 22 19:19:32.054492 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:32.054472 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52cb2651_45ad_4a05_9134_239af4efa2f1.slice/crio-744340ed5f4f3199e5070409d7d14beeb46fcb348ed6b5bb291e1d07b2e56f85 WatchSource:0}: Error finding container 744340ed5f4f3199e5070409d7d14beeb46fcb348ed6b5bb291e1d07b2e56f85: Status 404 returned error can't find the container with id 744340ed5f4f3199e5070409d7d14beeb46fcb348ed6b5bb291e1d07b2e56f85
Apr 22 19:19:32.143122 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.143097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcg2n\" (UniqueName: \"kubernetes.io/projected/8f040d06-9c31-401f-83d2-90969b7f25d9-kube-api-access-rcg2n\") pod \"authorino-5549cfc44-kq958\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") " pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.143255 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.143149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f040d06-9c31-401f-83d2-90969b7f25d9-tls-cert\") pod \"authorino-5549cfc44-kq958\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") " pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.145653 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.145630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f040d06-9c31-401f-83d2-90969b7f25d9-tls-cert\") pod \"authorino-5549cfc44-kq958\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") " pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.152353 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.152332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcg2n\" (UniqueName: \"kubernetes.io/projected/8f040d06-9c31-401f-83d2-90969b7f25d9-kube-api-access-rcg2n\") pod \"authorino-5549cfc44-kq958\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") " pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.279984 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.279939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:19:32.285472 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.285439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f7964d8-bvrfv" event={"ID":"52cb2651-45ad-4a05-9134-239af4efa2f1","Type":"ContainerStarted","Data":"744340ed5f4f3199e5070409d7d14beeb46fcb348ed6b5bb291e1d07b2e56f85"}
Apr 22 19:19:32.286871 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.286843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-j7k86" event={"ID":"ba32232c-87a1-47d3-a63a-431694766a0d","Type":"ContainerStarted","Data":"bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad"}
Apr 22 19:19:32.286971 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.286878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-j7k86" event={"ID":"ba32232c-87a1-47d3-a63a-431694766a0d","Type":"ContainerStarted","Data":"2f023ee1b058af0f2c60a6f7a3afacc6fb65ca23e332077c0cd7a0776e3dc649"}
Apr 22 19:19:32.286971 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.286911 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-j7k86" podUID="ba32232c-87a1-47d3-a63a-431694766a0d" containerName="authorino" containerID="cri-o://bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad" gracePeriod=30
Apr 22 19:19:32.312015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.311968 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-j7k86" podStartSLOduration=0.958203254 podStartE2EDuration="1.311953478s" podCreationTimestamp="2026-04-22 19:19:31 +0000 UTC" firstStartedPulling="2026-04-22 19:19:31.714010121 +0000 UTC m=+796.394487968" lastFinishedPulling="2026-04-22 19:19:32.067760359 +0000 UTC m=+796.748238192" observedRunningTime="2026-04-22 19:19:32.311058228 +0000 UTC m=+796.991536085" watchObservedRunningTime="2026-04-22 19:19:32.311953478 +0000 UTC m=+796.992431333"
Apr 22 19:19:32.408468 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.408438 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5549cfc44-kq958"]
Apr 22 19:19:32.409518 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:32.409490 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f040d06_9c31_401f_83d2_90969b7f25d9.slice/crio-375d84c2e8a625703062d1f82a1e2e634effb2edc972e573f3c5dc78ec3b9bd7 WatchSource:0}: Error finding container 375d84c2e8a625703062d1f82a1e2e634effb2edc972e573f3c5dc78ec3b9bd7: Status 404 returned error can't find the container with id 375d84c2e8a625703062d1f82a1e2e634effb2edc972e573f3c5dc78ec3b9bd7
Apr 22 19:19:32.536104 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.536079 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:32.545385 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.545365 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lskj6\" (UniqueName: \"kubernetes.io/projected/ba32232c-87a1-47d3-a63a-431694766a0d-kube-api-access-lskj6\") pod \"ba32232c-87a1-47d3-a63a-431694766a0d\" (UID: \"ba32232c-87a1-47d3-a63a-431694766a0d\") "
Apr 22 19:19:32.547239 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.547212 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba32232c-87a1-47d3-a63a-431694766a0d-kube-api-access-lskj6" (OuterVolumeSpecName: "kube-api-access-lskj6") pod "ba32232c-87a1-47d3-a63a-431694766a0d" (UID: "ba32232c-87a1-47d3-a63a-431694766a0d"). InnerVolumeSpecName "kube-api-access-lskj6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:19:32.646631 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:32.646542 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lskj6\" (UniqueName: \"kubernetes.io/projected/ba32232c-87a1-47d3-a63a-431694766a0d-kube-api-access-lskj6\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:19:33.291414 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.291376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f7964d8-bvrfv" event={"ID":"52cb2651-45ad-4a05-9134-239af4efa2f1","Type":"ContainerStarted","Data":"72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7"}
Apr 22 19:19:33.291729 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.291448 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-675f7964d8-bvrfv" podUID="52cb2651-45ad-4a05-9134-239af4efa2f1" containerName="authorino" containerID="cri-o://72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7" gracePeriod=30
Apr 22 19:19:33.292566 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.292528 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba32232c-87a1-47d3-a63a-431694766a0d" containerID="bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad" exitCode=0
Apr 22 19:19:33.292645 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.292592 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-j7k86"
Apr 22 19:19:33.292706 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.292590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-j7k86" event={"ID":"ba32232c-87a1-47d3-a63a-431694766a0d","Type":"ContainerDied","Data":"bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad"}
Apr 22 19:19:33.292706 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.292700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-j7k86" event={"ID":"ba32232c-87a1-47d3-a63a-431694766a0d","Type":"ContainerDied","Data":"2f023ee1b058af0f2c60a6f7a3afacc6fb65ca23e332077c0cd7a0776e3dc649"}
Apr 22 19:19:33.292812 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.292720 2576 scope.go:117] "RemoveContainer" containerID="bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad"
Apr 22 19:19:33.294179 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.294154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5549cfc44-kq958" event={"ID":"8f040d06-9c31-401f-83d2-90969b7f25d9","Type":"ContainerStarted","Data":"15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68"}
Apr 22 19:19:33.294281 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.294185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5549cfc44-kq958" event={"ID":"8f040d06-9c31-401f-83d2-90969b7f25d9","Type":"ContainerStarted","Data":"375d84c2e8a625703062d1f82a1e2e634effb2edc972e573f3c5dc78ec3b9bd7"}
Apr 22 19:19:33.301942 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.301920 2576 scope.go:117] "RemoveContainer" containerID="bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad"
Apr 22 19:19:33.302202 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:33.302182 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad\": container with ID starting with bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad not found: ID does not exist" containerID="bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad"
Apr 22 19:19:33.302289 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.302206 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad"} err="failed to get container status \"bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad\": rpc error: code = NotFound desc = could not find container \"bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad\": container with ID starting with bc841d9f6a8bdeb19869b380e0a57fe09374bfdf6a33079ba2b806a53ee094ad not found: ID does not exist"
Apr 22 19:19:33.312887 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.312848 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-675f7964d8-bvrfv" podStartSLOduration=2.009329953 podStartE2EDuration="2.312837089s" podCreationTimestamp="2026-04-22 19:19:31 +0000 UTC" firstStartedPulling="2026-04-22 19:19:32.055712368 +0000 UTC m=+796.736190204" lastFinishedPulling="2026-04-22 19:19:32.359219494 +0000 UTC m=+797.039697340" observedRunningTime="2026-04-22 19:19:33.310377187 +0000 UTC m=+797.990855043" watchObservedRunningTime="2026-04-22 19:19:33.312837089 +0000 UTC m=+797.993314963"
Apr 22 19:19:33.334341 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.334306 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5549cfc44-kq958" podStartSLOduration=2.031331739 podStartE2EDuration="2.334296051s" podCreationTimestamp="2026-04-22 19:19:31 +0000 UTC" firstStartedPulling="2026-04-22 19:19:32.411012481 +0000 UTC m=+797.091490327" lastFinishedPulling="2026-04-22 19:19:32.713976807 +0000 UTC m=+797.394454639" observedRunningTime="2026-04-22 19:19:33.33061926 +0000 UTC m=+798.011097114" watchObservedRunningTime="2026-04-22 19:19:33.334296051 +0000 UTC m=+798.014773907"
Apr 22 19:19:33.353263 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.353242 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-j7k86"]
Apr 22 19:19:33.360443 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.360424 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-j7k86"]
Apr 22 19:19:33.368990 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.368970 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-lfnd2"]
Apr 22 19:19:33.369162 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.369145 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-lfnd2" podUID="20f00377-a013-4ce4-839a-68b91ffb01c2" containerName="authorino" containerID="cri-o://b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69" gracePeriod=30
Apr 22 19:19:33.623162 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.623144 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lfnd2"
Apr 22 19:19:33.626279 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.626262 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:33.652271 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.652233 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwlr\" (UniqueName: \"kubernetes.io/projected/52cb2651-45ad-4a05-9134-239af4efa2f1-kube-api-access-xxwlr\") pod \"52cb2651-45ad-4a05-9134-239af4efa2f1\" (UID: \"52cb2651-45ad-4a05-9134-239af4efa2f1\") "
Apr 22 19:19:33.652419 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.652313 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qfmg\" (UniqueName: \"kubernetes.io/projected/20f00377-a013-4ce4-839a-68b91ffb01c2-kube-api-access-7qfmg\") pod \"20f00377-a013-4ce4-839a-68b91ffb01c2\" (UID: \"20f00377-a013-4ce4-839a-68b91ffb01c2\") "
Apr 22 19:19:33.654198 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.654172 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f00377-a013-4ce4-839a-68b91ffb01c2-kube-api-access-7qfmg" (OuterVolumeSpecName: "kube-api-access-7qfmg") pod "20f00377-a013-4ce4-839a-68b91ffb01c2" (UID: "20f00377-a013-4ce4-839a-68b91ffb01c2"). InnerVolumeSpecName "kube-api-access-7qfmg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:19:33.654293 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.654275 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cb2651-45ad-4a05-9134-239af4efa2f1-kube-api-access-xxwlr" (OuterVolumeSpecName: "kube-api-access-xxwlr") pod "52cb2651-45ad-4a05-9134-239af4efa2f1" (UID: "52cb2651-45ad-4a05-9134-239af4efa2f1"). InnerVolumeSpecName "kube-api-access-xxwlr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:19:33.753201 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.753176 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qfmg\" (UniqueName: \"kubernetes.io/projected/20f00377-a013-4ce4-839a-68b91ffb01c2-kube-api-access-7qfmg\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:19:33.753201 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.753201 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxwlr\" (UniqueName: \"kubernetes.io/projected/52cb2651-45ad-4a05-9134-239af4efa2f1-kube-api-access-xxwlr\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:19:33.888702 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:33.888635 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba32232c-87a1-47d3-a63a-431694766a0d" path="/var/lib/kubelet/pods/ba32232c-87a1-47d3-a63a-431694766a0d/volumes"
Apr 22 19:19:34.235480 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235413 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-xgnmd"]
Apr 22 19:19:34.235739 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235727 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba32232c-87a1-47d3-a63a-431694766a0d" containerName="authorino"
Apr 22 19:19:34.235783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235741 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba32232c-87a1-47d3-a63a-431694766a0d" containerName="authorino"
Apr 22 19:19:34.235783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235759 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52cb2651-45ad-4a05-9134-239af4efa2f1" containerName="authorino"
Apr 22 19:19:34.235783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235765 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cb2651-45ad-4a05-9134-239af4efa2f1" containerName="authorino"
Apr 22 19:19:34.235783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235774 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f00377-a013-4ce4-839a-68b91ffb01c2" containerName="authorino"
Apr 22 19:19:34.235783 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235779 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f00377-a013-4ce4-839a-68b91ffb01c2" containerName="authorino"
Apr 22 19:19:34.235928 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235823 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f00377-a013-4ce4-839a-68b91ffb01c2" containerName="authorino"
Apr 22 19:19:34.235928 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235833 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba32232c-87a1-47d3-a63a-431694766a0d" containerName="authorino"
Apr 22 19:19:34.235928 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.235839 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="52cb2651-45ad-4a05-9134-239af4efa2f1" containerName="authorino"
Apr 22 19:19:34.239789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.239772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd"
Apr 22 19:19:34.252746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.252725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kbqds\""
Apr 22 19:19:34.253895 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.253875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-xgnmd"]
Apr 22 19:19:34.256141 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.256120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntv8\" (UniqueName: \"kubernetes.io/projected/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d-kube-api-access-xntv8\") pod \"maas-controller-6d4c8f55f9-xgnmd\" (UID: \"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d\") " pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd"
Apr 22 19:19:34.298345 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.298318 2576 generic.go:358] "Generic (PLEG): container finished" podID="20f00377-a013-4ce4-839a-68b91ffb01c2" containerID="b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69" exitCode=0
Apr 22 19:19:34.298453 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.298381 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lfnd2"
Apr 22 19:19:34.298453 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.298403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lfnd2" event={"ID":"20f00377-a013-4ce4-839a-68b91ffb01c2","Type":"ContainerDied","Data":"b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69"}
Apr 22 19:19:34.298453 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.298438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lfnd2" event={"ID":"20f00377-a013-4ce4-839a-68b91ffb01c2","Type":"ContainerDied","Data":"73898b714741e32bc2b9d1e1341e249efa117ce20d422d7914b29015becce160"}
Apr 22 19:19:34.298603 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.298455 2576 scope.go:117] "RemoveContainer" containerID="b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69"
Apr 22 19:19:34.299818 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.299531 2576 generic.go:358] "Generic (PLEG): container finished" podID="52cb2651-45ad-4a05-9134-239af4efa2f1" containerID="72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7" exitCode=0
Apr 22 19:19:34.299818 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.299585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f7964d8-bvrfv" event={"ID":"52cb2651-45ad-4a05-9134-239af4efa2f1","Type":"ContainerDied","Data":"72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7"}
Apr 22 19:19:34.299818 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.299616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-675f7964d8-bvrfv" event={"ID":"52cb2651-45ad-4a05-9134-239af4efa2f1","Type":"ContainerDied","Data":"744340ed5f4f3199e5070409d7d14beeb46fcb348ed6b5bb291e1d07b2e56f85"}
Apr 22 19:19:34.299818 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.299621 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-675f7964d8-bvrfv"
Apr 22 19:19:34.307099 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.307074 2576 scope.go:117] "RemoveContainer" containerID="b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69"
Apr 22 19:19:34.307317 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:34.307300 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69\": container with ID starting with b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69 not found: ID does not exist" containerID="b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69"
Apr 22 19:19:34.307370 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.307323 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69"} err="failed to get container status \"b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69\": rpc error: code = NotFound desc = could not find container \"b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69\": container with ID starting with b830f9ecb2b69aaf59fcd543fd85c5ecddb86b24f6dd9e1c811ed4eca89fcc69 not found: ID does not exist"
Apr 22 19:19:34.307370 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.307339 2576 scope.go:117] "RemoveContainer" containerID="72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7"
Apr 22 19:19:34.314322 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.314307 2576 scope.go:117] "RemoveContainer" containerID="72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7"
Apr 22 19:19:34.314684 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:34.314525 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7\": container with ID starting with 72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7 not found: ID does not exist" containerID="72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7"
Apr 22 19:19:34.314684 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.314584 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7"} err="failed to get container status \"72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7\": rpc error: code = NotFound desc = could not find container \"72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7\": container with ID starting with 72cb88eaa495bb4903a0431d1402b303a5748c6b28b0421361eceba4ce1068a7 not found: ID does not exist"
Apr 22 19:19:34.332813 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.332791 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-lfnd2"]
Apr 22 19:19:34.340923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.340905 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-lfnd2"]
Apr 22 19:19:34.356732 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.356712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xntv8\" (UniqueName: \"kubernetes.io/projected/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d-kube-api-access-xntv8\") pod \"maas-controller-6d4c8f55f9-xgnmd\" (UID: \"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d\") " pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd"
Apr 22 19:19:34.362082 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.362064 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-675f7964d8-bvrfv"]
Apr 22 19:19:34.366002 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.365986 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-675f7964d8-bvrfv"]
Apr 22 19:19:34.368000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.367977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntv8\" (UniqueName: \"kubernetes.io/projected/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d-kube-api-access-xntv8\") pod \"maas-controller-6d4c8f55f9-xgnmd\" (UID: \"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d\") " pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd"
Apr 22 19:19:34.401945 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.401894 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-59c9fddb94-8xp7b"]
Apr 22 19:19:34.406392 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.406377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-59c9fddb94-8xp7b"
Apr 22 19:19:34.415573 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.415532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-59c9fddb94-8xp7b"]
Apr 22 19:19:34.457723 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.457684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkkx\" (UniqueName: \"kubernetes.io/projected/24dae771-a5cd-4af4-b3c9-6e5898fa7146-kube-api-access-fkkkx\") pod \"maas-controller-59c9fddb94-8xp7b\" (UID: \"24dae771-a5cd-4af4-b3c9-6e5898fa7146\") " pod="opendatahub/maas-controller-59c9fddb94-8xp7b"
Apr 22 19:19:34.519610 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.519577 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-59c9fddb94-8xp7b"]
Apr 22 19:19:34.519811 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:34.519790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-fkkkx], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-59c9fddb94-8xp7b" podUID="24dae771-a5cd-4af4-b3c9-6e5898fa7146"
Apr 22 19:19:34.549239 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.549201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd"
Apr 22 19:19:34.551337 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.551318 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66b776d7b6-8jq65"]
Apr 22 19:19:34.554759 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.554745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66b776d7b6-8jq65"
Apr 22 19:19:34.558500 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.558480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkkx\" (UniqueName: \"kubernetes.io/projected/24dae771-a5cd-4af4-b3c9-6e5898fa7146-kube-api-access-fkkkx\") pod \"maas-controller-59c9fddb94-8xp7b\" (UID: \"24dae771-a5cd-4af4-b3c9-6e5898fa7146\") " pod="opendatahub/maas-controller-59c9fddb94-8xp7b"
Apr 22 19:19:34.558627 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.558522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2x8n\" (UniqueName: \"kubernetes.io/projected/388c53c2-bd6b-41dd-878a-1f98aa03fbe8-kube-api-access-d2x8n\") pod \"maas-controller-66b776d7b6-8jq65\" (UID: \"388c53c2-bd6b-41dd-878a-1f98aa03fbe8\") " pod="opendatahub/maas-controller-66b776d7b6-8jq65"
Apr 22 19:19:34.566432 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.566408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66b776d7b6-8jq65"]
Apr 22 19:19:34.572349 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.572317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkkx\" (UniqueName: \"kubernetes.io/projected/24dae771-a5cd-4af4-b3c9-6e5898fa7146-kube-api-access-fkkkx\") pod \"maas-controller-59c9fddb94-8xp7b\" (UID: \"24dae771-a5cd-4af4-b3c9-6e5898fa7146\") " pod="opendatahub/maas-controller-59c9fddb94-8xp7b"
Apr 22 19:19:34.659271 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.659239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2x8n\" (UniqueName: \"kubernetes.io/projected/388c53c2-bd6b-41dd-878a-1f98aa03fbe8-kube-api-access-d2x8n\") pod \"maas-controller-66b776d7b6-8jq65\" (UID: \"388c53c2-bd6b-41dd-878a-1f98aa03fbe8\") " pod="opendatahub/maas-controller-66b776d7b6-8jq65"
Apr 22 19:19:34.666963 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.666940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2x8n\" (UniqueName: \"kubernetes.io/projected/388c53c2-bd6b-41dd-878a-1f98aa03fbe8-kube-api-access-d2x8n\") pod \"maas-controller-66b776d7b6-8jq65\" (UID: \"388c53c2-bd6b-41dd-878a-1f98aa03fbe8\") " pod="opendatahub/maas-controller-66b776d7b6-8jq65"
Apr 22 19:19:34.671263 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.671243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-xgnmd"]
Apr 22 19:19:34.672048 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:34.672024 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cbd382c_d5a9_4521_9ff2_3dd288e1b28d.slice/crio-1da30f6e65398adb925066d2156c0118cb9f8afb3fe112792f2621ab914a75e8 WatchSource:0}: Error finding container 1da30f6e65398adb925066d2156c0118cb9f8afb3fe112792f2621ab914a75e8: Status 404 returned error can't find the container with id 1da30f6e65398adb925066d2156c0118cb9f8afb3fe112792f2621ab914a75e8
Apr 22 19:19:34.885423 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:34.885357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66b776d7b6-8jq65"
Apr 22 19:19:35.001872 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.001848 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66b776d7b6-8jq65"]
Apr 22 19:19:35.003173 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:35.003145 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388c53c2_bd6b_41dd_878a_1f98aa03fbe8.slice/crio-f1bc023ea4cc5ab166b6d2042ae7575d5c7f5cb8ad05eed7ffe4be30669220f1 WatchSource:0}: Error finding container f1bc023ea4cc5ab166b6d2042ae7575d5c7f5cb8ad05eed7ffe4be30669220f1: Status 404 returned error can't find the container with id f1bc023ea4cc5ab166b6d2042ae7575d5c7f5cb8ad05eed7ffe4be30669220f1
Apr 22 19:19:35.308391 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.308354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66b776d7b6-8jq65" event={"ID":"388c53c2-bd6b-41dd-878a-1f98aa03fbe8","Type":"ContainerStarted","Data":"f1bc023ea4cc5ab166b6d2042ae7575d5c7f5cb8ad05eed7ffe4be30669220f1"}
Apr 22 19:19:35.309754 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.309727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" event={"ID":"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d","Type":"ContainerStarted","Data":"1da30f6e65398adb925066d2156c0118cb9f8afb3fe112792f2621ab914a75e8"}
Apr 22 19:19:35.310778 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.310752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-59c9fddb94-8xp7b"
Apr 22 19:19:35.316268 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.316244 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-59c9fddb94-8xp7b" Apr 22 19:19:35.364123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.364099 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkkkx\" (UniqueName: \"kubernetes.io/projected/24dae771-a5cd-4af4-b3c9-6e5898fa7146-kube-api-access-fkkkx\") pod \"24dae771-a5cd-4af4-b3c9-6e5898fa7146\" (UID: \"24dae771-a5cd-4af4-b3c9-6e5898fa7146\") " Apr 22 19:19:35.366509 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.366470 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24dae771-a5cd-4af4-b3c9-6e5898fa7146-kube-api-access-fkkkx" (OuterVolumeSpecName: "kube-api-access-fkkkx") pod "24dae771-a5cd-4af4-b3c9-6e5898fa7146" (UID: "24dae771-a5cd-4af4-b3c9-6e5898fa7146"). InnerVolumeSpecName "kube-api-access-fkkkx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:35.465016 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.464981 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkkkx\" (UniqueName: \"kubernetes.io/projected/24dae771-a5cd-4af4-b3c9-6e5898fa7146-kube-api-access-fkkkx\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:19:35.890691 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.890655 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f00377-a013-4ce4-839a-68b91ffb01c2" path="/var/lib/kubelet/pods/20f00377-a013-4ce4-839a-68b91ffb01c2/volumes" Apr 22 19:19:35.891738 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:35.891711 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cb2651-45ad-4a05-9134-239af4efa2f1" path="/var/lib/kubelet/pods/52cb2651-45ad-4a05-9134-239af4efa2f1/volumes" Apr 22 19:19:36.315758 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:36.315719 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-59c9fddb94-8xp7b" Apr 22 19:19:36.345813 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:36.345782 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-59c9fddb94-8xp7b"] Apr 22 19:19:36.348745 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:36.348717 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-59c9fddb94-8xp7b"] Apr 22 19:19:37.321593 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:37.321537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" event={"ID":"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d","Type":"ContainerStarted","Data":"8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953"} Apr 22 19:19:37.321789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:37.321664 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" Apr 22 19:19:37.344893 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:37.344843 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" podStartSLOduration=0.821260292 podStartE2EDuration="3.344829945s" podCreationTimestamp="2026-04-22 19:19:34 +0000 UTC" firstStartedPulling="2026-04-22 19:19:34.673323625 +0000 UTC m=+799.353801458" lastFinishedPulling="2026-04-22 19:19:37.196893269 +0000 UTC m=+801.877371111" observedRunningTime="2026-04-22 19:19:37.341819215 +0000 UTC m=+802.022297071" watchObservedRunningTime="2026-04-22 19:19:37.344829945 +0000 UTC m=+802.025307801" Apr 22 19:19:37.885137 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:37.885101 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dae771-a5cd-4af4-b3c9-6e5898fa7146" path="/var/lib/kubelet/pods/24dae771-a5cd-4af4-b3c9-6e5898fa7146/volumes" Apr 22 19:19:38.326291 ip-10-0-141-191 kubenswrapper[2576]: I0422 
19:19:38.326249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66b776d7b6-8jq65" event={"ID":"388c53c2-bd6b-41dd-878a-1f98aa03fbe8","Type":"ContainerStarted","Data":"1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146"} Apr 22 19:19:38.326502 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:38.326486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66b776d7b6-8jq65" Apr 22 19:19:38.347665 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:38.347624 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66b776d7b6-8jq65" podStartSLOduration=1.5949380610000001 podStartE2EDuration="4.347610791s" podCreationTimestamp="2026-04-22 19:19:34 +0000 UTC" firstStartedPulling="2026-04-22 19:19:35.004396926 +0000 UTC m=+799.684874763" lastFinishedPulling="2026-04-22 19:19:37.757069643 +0000 UTC m=+802.437547493" observedRunningTime="2026-04-22 19:19:38.345313496 +0000 UTC m=+803.025791363" watchObservedRunningTime="2026-04-22 19:19:38.347610791 +0000 UTC m=+803.028088646" Apr 22 19:19:39.541223 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.541186 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6d45f5d894-bvdbj"] Apr 22 19:19:39.547151 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.547128 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:39.549736 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.549700 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 19:19:39.549736 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.549716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 19:19:39.549904 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.549784 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-ls9pr\"" Apr 22 19:19:39.555366 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.555348 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6d45f5d894-bvdbj"] Apr 22 19:19:39.598889 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.598866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls\") pod \"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:39.598999 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.598899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckbj\" (UniqueName: \"kubernetes.io/projected/d8bd0c42-c2ff-4b81-b28b-582ed5428220-kube-api-access-rckbj\") pod \"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:39.700033 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.699989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls\") pod 
\"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:39.700164 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.700036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rckbj\" (UniqueName: \"kubernetes.io/projected/d8bd0c42-c2ff-4b81-b28b-582ed5428220-kube-api-access-rckbj\") pod \"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:39.700164 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:39.700128 2576 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 22 19:19:39.700268 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:39.700187 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls podName:d8bd0c42-c2ff-4b81-b28b-582ed5428220 nodeName:}" failed. No retries permitted until 2026-04-22 19:19:40.20017159 +0000 UTC m=+804.880649423 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls") pod "maas-api-6d45f5d894-bvdbj" (UID: "d8bd0c42-c2ff-4b81-b28b-582ed5428220") : secret "maas-api-serving-cert" not found Apr 22 19:19:39.708932 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:39.708912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckbj\" (UniqueName: \"kubernetes.io/projected/d8bd0c42-c2ff-4b81-b28b-582ed5428220-kube-api-access-rckbj\") pod \"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:40.203618 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:40.203563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls\") pod \"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:40.205873 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:40.205844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls\") pod \"maas-api-6d45f5d894-bvdbj\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") " pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:40.459341 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:40.459271 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:40.592815 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:40.592791 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6d45f5d894-bvdbj"] Apr 22 19:19:40.595187 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:40.595153 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bd0c42_c2ff_4b81_b28b_582ed5428220.slice/crio-6bca03cceae1a5ebeb179f1a7de5dda9ca79e56c592311e2b11030ae44d1009b WatchSource:0}: Error finding container 6bca03cceae1a5ebeb179f1a7de5dda9ca79e56c592311e2b11030ae44d1009b: Status 404 returned error can't find the container with id 6bca03cceae1a5ebeb179f1a7de5dda9ca79e56c592311e2b11030ae44d1009b Apr 22 19:19:41.339741 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:41.339709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6d45f5d894-bvdbj" event={"ID":"d8bd0c42-c2ff-4b81-b28b-582ed5428220","Type":"ContainerStarted","Data":"6bca03cceae1a5ebeb179f1a7de5dda9ca79e56c592311e2b11030ae44d1009b"} Apr 22 19:19:42.344045 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:42.344013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6d45f5d894-bvdbj" event={"ID":"d8bd0c42-c2ff-4b81-b28b-582ed5428220","Type":"ContainerStarted","Data":"08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09"} Apr 22 19:19:42.344412 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:42.344151 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:42.364624 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:42.362870 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6d45f5d894-bvdbj" podStartSLOduration=1.859941794 podStartE2EDuration="3.362850871s" podCreationTimestamp="2026-04-22 19:19:39 +0000 UTC" 
firstStartedPulling="2026-04-22 19:19:40.596510484 +0000 UTC m=+805.276988320" lastFinishedPulling="2026-04-22 19:19:42.099419564 +0000 UTC m=+806.779897397" observedRunningTime="2026-04-22 19:19:42.360795509 +0000 UTC m=+807.041273364" watchObservedRunningTime="2026-04-22 19:19:42.362850871 +0000 UTC m=+807.043328727" Apr 22 19:19:48.331026 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:48.330996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" Apr 22 19:19:48.352598 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:48.352575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6d45f5d894-bvdbj" Apr 22 19:19:49.336302 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.336274 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66b776d7b6-8jq65" Apr 22 19:19:49.379652 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.379627 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-xgnmd"] Apr 22 19:19:49.379821 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.379791 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" podUID="3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" containerName="manager" containerID="cri-o://8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953" gracePeriod=10 Apr 22 19:19:49.617746 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.617724 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" Apr 22 19:19:49.671939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.671910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntv8\" (UniqueName: \"kubernetes.io/projected/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d-kube-api-access-xntv8\") pod \"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d\" (UID: \"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d\") " Apr 22 19:19:49.673966 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.673939 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d-kube-api-access-xntv8" (OuterVolumeSpecName: "kube-api-access-xntv8") pod "3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" (UID: "3cbd382c-d5a9-4521-9ff2-3dd288e1b28d"). InnerVolumeSpecName "kube-api-access-xntv8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:49.691205 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.691183 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6dcdf69c54-tkw65"] Apr 22 19:19:49.691493 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.691482 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" containerName="manager" Apr 22 19:19:49.691531 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.691495 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" containerName="manager" Apr 22 19:19:49.691579 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.691564 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" containerName="manager" Apr 22 19:19:49.695024 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.695009 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:19:49.707958 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.707934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6dcdf69c54-tkw65"] Apr 22 19:19:49.772448 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.772422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xskk\" (UniqueName: \"kubernetes.io/projected/c258c08c-38d0-4961-a2f5-f6ff503f09af-kube-api-access-7xskk\") pod \"maas-controller-6dcdf69c54-tkw65\" (UID: \"c258c08c-38d0-4961-a2f5-f6ff503f09af\") " pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:19:49.772576 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.772532 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xntv8\" (UniqueName: \"kubernetes.io/projected/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d-kube-api-access-xntv8\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\"" Apr 22 19:19:49.873151 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.873092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xskk\" (UniqueName: \"kubernetes.io/projected/c258c08c-38d0-4961-a2f5-f6ff503f09af-kube-api-access-7xskk\") pod \"maas-controller-6dcdf69c54-tkw65\" (UID: \"c258c08c-38d0-4961-a2f5-f6ff503f09af\") " pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:19:49.882125 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:49.882103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xskk\" (UniqueName: \"kubernetes.io/projected/c258c08c-38d0-4961-a2f5-f6ff503f09af-kube-api-access-7xskk\") pod \"maas-controller-6dcdf69c54-tkw65\" (UID: \"c258c08c-38d0-4961-a2f5-f6ff503f09af\") " pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:19:50.004912 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.004879 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:19:50.331756 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.331732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6dcdf69c54-tkw65"] Apr 22 19:19:50.333396 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:19:50.333369 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc258c08c_38d0_4961_a2f5_f6ff503f09af.slice/crio-5cb724eeaa32a86cb89b7de6181d0082f6938382fb6e17dec40e173dc06a8b0d WatchSource:0}: Error finding container 5cb724eeaa32a86cb89b7de6181d0082f6938382fb6e17dec40e173dc06a8b0d: Status 404 returned error can't find the container with id 5cb724eeaa32a86cb89b7de6181d0082f6938382fb6e17dec40e173dc06a8b0d Apr 22 19:19:50.377929 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.377894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" event={"ID":"c258c08c-38d0-4961-a2f5-f6ff503f09af","Type":"ContainerStarted","Data":"5cb724eeaa32a86cb89b7de6181d0082f6938382fb6e17dec40e173dc06a8b0d"} Apr 22 19:19:50.379069 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.379048 2576 generic.go:358] "Generic (PLEG): container finished" podID="3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" containerID="8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953" exitCode=0 Apr 22 19:19:50.379158 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.379126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" event={"ID":"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d","Type":"ContainerDied","Data":"8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953"} Apr 22 19:19:50.379158 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.379134 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" Apr 22 19:19:50.379158 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.379151 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-xgnmd" event={"ID":"3cbd382c-d5a9-4521-9ff2-3dd288e1b28d","Type":"ContainerDied","Data":"1da30f6e65398adb925066d2156c0118cb9f8afb3fe112792f2621ab914a75e8"} Apr 22 19:19:50.379262 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.379169 2576 scope.go:117] "RemoveContainer" containerID="8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953" Apr 22 19:19:50.387388 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.387371 2576 scope.go:117] "RemoveContainer" containerID="8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953" Apr 22 19:19:50.387646 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:19:50.387627 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953\": container with ID starting with 8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953 not found: ID does not exist" containerID="8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953" Apr 22 19:19:50.387706 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.387657 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953"} err="failed to get container status \"8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953\": rpc error: code = NotFound desc = could not find container \"8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953\": container with ID starting with 8fae3fae1493d604c1e13af990fe95f466562390768206564a625aac04d5e953 not found: ID does not exist" Apr 22 19:19:50.400355 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.400328 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-xgnmd"] Apr 22 19:19:50.404464 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:50.404441 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-xgnmd"] Apr 22 19:19:51.385009 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:51.384973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" event={"ID":"c258c08c-38d0-4961-a2f5-f6ff503f09af","Type":"ContainerStarted","Data":"86b930e37b133c9094b17a621782a46c2b5b1335ddb09332debfada0cdef75d4"} Apr 22 19:19:51.385400 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:51.385134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:19:51.407672 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:51.407618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" podStartSLOduration=1.913589934 podStartE2EDuration="2.407605126s" podCreationTimestamp="2026-04-22 19:19:49 +0000 UTC" firstStartedPulling="2026-04-22 19:19:50.334684052 +0000 UTC m=+815.015161886" lastFinishedPulling="2026-04-22 19:19:50.82869923 +0000 UTC m=+815.509177078" observedRunningTime="2026-04-22 19:19:51.405339613 +0000 UTC m=+816.085817468" watchObservedRunningTime="2026-04-22 19:19:51.407605126 +0000 UTC m=+816.088082980" Apr 22 19:19:51.884212 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:19:51.884180 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cbd382c-d5a9-4521-9ff2-3dd288e1b28d" path="/var/lib/kubelet/pods/3cbd382c-d5a9-4521-9ff2-3dd288e1b28d/volumes" Apr 22 19:20:02.394339 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.394310 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6dcdf69c54-tkw65" Apr 22 19:20:02.444417 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.444382 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66b776d7b6-8jq65"] Apr 22 19:20:02.444655 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.444632 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-66b776d7b6-8jq65" podUID="388c53c2-bd6b-41dd-878a-1f98aa03fbe8" containerName="manager" containerID="cri-o://1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146" gracePeriod=10 Apr 22 19:20:02.682944 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.682924 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66b776d7b6-8jq65" Apr 22 19:20:02.768829 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.768802 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2x8n\" (UniqueName: \"kubernetes.io/projected/388c53c2-bd6b-41dd-878a-1f98aa03fbe8-kube-api-access-d2x8n\") pod \"388c53c2-bd6b-41dd-878a-1f98aa03fbe8\" (UID: \"388c53c2-bd6b-41dd-878a-1f98aa03fbe8\") " Apr 22 19:20:02.770682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.770655 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388c53c2-bd6b-41dd-878a-1f98aa03fbe8-kube-api-access-d2x8n" (OuterVolumeSpecName: "kube-api-access-d2x8n") pod "388c53c2-bd6b-41dd-878a-1f98aa03fbe8" (UID: "388c53c2-bd6b-41dd-878a-1f98aa03fbe8"). InnerVolumeSpecName "kube-api-access-d2x8n". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:20:02.870017 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:02.869990 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2x8n\" (UniqueName: \"kubernetes.io/projected/388c53c2-bd6b-41dd-878a-1f98aa03fbe8-kube-api-access-d2x8n\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:20:03.432749 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.432719 2576 generic.go:358] "Generic (PLEG): container finished" podID="388c53c2-bd6b-41dd-878a-1f98aa03fbe8" containerID="1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146" exitCode=0
Apr 22 19:20:03.433135 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.432800 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66b776d7b6-8jq65"
Apr 22 19:20:03.433135 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.432801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66b776d7b6-8jq65" event={"ID":"388c53c2-bd6b-41dd-878a-1f98aa03fbe8","Type":"ContainerDied","Data":"1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146"}
Apr 22 19:20:03.433135 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.432906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66b776d7b6-8jq65" event={"ID":"388c53c2-bd6b-41dd-878a-1f98aa03fbe8","Type":"ContainerDied","Data":"f1bc023ea4cc5ab166b6d2042ae7575d5c7f5cb8ad05eed7ffe4be30669220f1"}
Apr 22 19:20:03.433135 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.432921 2576 scope.go:117] "RemoveContainer" containerID="1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146"
Apr 22 19:20:03.441085 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.441068 2576 scope.go:117] "RemoveContainer" containerID="1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146"
Apr 22 19:20:03.441313 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:20:03.441286 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146\": container with ID starting with 1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146 not found: ID does not exist" containerID="1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146"
Apr 22 19:20:03.441382 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.441319 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146"} err="failed to get container status \"1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146\": rpc error: code = NotFound desc = could not find container \"1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146\": container with ID starting with 1dd2151ca5c3d59cd4e29dc42b84f734a34d2811bf0ab34d64bfd3839612c146 not found: ID does not exist"
Apr 22 19:20:03.469096 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.469069 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66b776d7b6-8jq65"]
Apr 22 19:20:03.482774 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.482752 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-66b776d7b6-8jq65"]
Apr 22 19:20:03.883800 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:03.883773 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388c53c2-bd6b-41dd-878a-1f98aa03fbe8" path="/var/lib/kubelet/pods/388c53c2-bd6b-41dd-878a-1f98aa03fbe8/volumes"
Apr 22 19:20:09.007107 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.007065 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-cbb586f7c-bj2br"]
Apr 22 19:20:09.007669 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.007436 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="388c53c2-bd6b-41dd-878a-1f98aa03fbe8" containerName="manager"
Apr 22 19:20:09.007669 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.007450 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="388c53c2-bd6b-41dd-878a-1f98aa03fbe8" containerName="manager"
Apr 22 19:20:09.007669 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.007566 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="388c53c2-bd6b-41dd-878a-1f98aa03fbe8" containerName="manager"
Apr 22 19:20:09.062619 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.062575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-cbb586f7c-bj2br"]
Apr 22 19:20:09.062780 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.062710 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.116715 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.116688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtlx\" (UniqueName: \"kubernetes.io/projected/910d0f8d-0439-4b24-9666-7335d35d534b-kube-api-access-qdtlx\") pod \"maas-api-cbb586f7c-bj2br\" (UID: \"910d0f8d-0439-4b24-9666-7335d35d534b\") " pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.116842 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.116721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/910d0f8d-0439-4b24-9666-7335d35d534b-maas-api-tls\") pod \"maas-api-cbb586f7c-bj2br\" (UID: \"910d0f8d-0439-4b24-9666-7335d35d534b\") " pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.217794 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.217753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtlx\" (UniqueName: \"kubernetes.io/projected/910d0f8d-0439-4b24-9666-7335d35d534b-kube-api-access-qdtlx\") pod \"maas-api-cbb586f7c-bj2br\" (UID: \"910d0f8d-0439-4b24-9666-7335d35d534b\") " pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.217944 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.217801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/910d0f8d-0439-4b24-9666-7335d35d534b-maas-api-tls\") pod \"maas-api-cbb586f7c-bj2br\" (UID: \"910d0f8d-0439-4b24-9666-7335d35d534b\") " pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.220371 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.220342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/910d0f8d-0439-4b24-9666-7335d35d534b-maas-api-tls\") pod \"maas-api-cbb586f7c-bj2br\" (UID: \"910d0f8d-0439-4b24-9666-7335d35d534b\") " pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.231114 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.231092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtlx\" (UniqueName: \"kubernetes.io/projected/910d0f8d-0439-4b24-9666-7335d35d534b-kube-api-access-qdtlx\") pod \"maas-api-cbb586f7c-bj2br\" (UID: \"910d0f8d-0439-4b24-9666-7335d35d534b\") " pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.373462 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.373394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:09.503055 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.503025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-cbb586f7c-bj2br"]
Apr 22 19:20:09.503845 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:20:09.503815 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910d0f8d_0439_4b24_9666_7335d35d534b.slice/crio-823392fb659750a06908bd9f66c246715c7385d8c6ff75542246eb954023571c WatchSource:0}: Error finding container 823392fb659750a06908bd9f66c246715c7385d8c6ff75542246eb954023571c: Status 404 returned error can't find the container with id 823392fb659750a06908bd9f66c246715c7385d8c6ff75542246eb954023571c
Apr 22 19:20:09.505101 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:09.505086 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:20:10.465446 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:10.465408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-cbb586f7c-bj2br" event={"ID":"910d0f8d-0439-4b24-9666-7335d35d534b","Type":"ContainerStarted","Data":"823392fb659750a06908bd9f66c246715c7385d8c6ff75542246eb954023571c"}
Apr 22 19:20:12.474154 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:12.474121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-cbb586f7c-bj2br" event={"ID":"910d0f8d-0439-4b24-9666-7335d35d534b","Type":"ContainerStarted","Data":"8b0a98816c7954dd310b27c65dc20d90b8a34940996f2a825f4f59b8ae60d245"}
Apr 22 19:20:12.474527 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:12.474294 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:12.495169 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:12.495110 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-cbb586f7c-bj2br" podStartSLOduration=2.259881383 podStartE2EDuration="4.495097949s" podCreationTimestamp="2026-04-22 19:20:08 +0000 UTC" firstStartedPulling="2026-04-22 19:20:09.505199363 +0000 UTC m=+834.185677196" lastFinishedPulling="2026-04-22 19:20:11.740415929 +0000 UTC m=+836.420893762" observedRunningTime="2026-04-22 19:20:12.491985195 +0000 UTC m=+837.172463060" watchObservedRunningTime="2026-04-22 19:20:12.495097949 +0000 UTC m=+837.175575848"
Apr 22 19:20:14.712146 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.712115 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"]
Apr 22 19:20:14.716351 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.716334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.718830 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.718802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 22 19:20:14.718830 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.718819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 22 19:20:14.718830 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.718826 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 22 19:20:14.719583 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.719567 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-rhw56\""
Apr 22 19:20:14.725376 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.725355 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"]
Apr 22 19:20:14.860118 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.860089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.860256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.860123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.860256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.860154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwml5\" (UniqueName: \"kubernetes.io/projected/157e30fa-f8ef-4f33-aec5-c34208a28d25-kube-api-access-pwml5\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.860256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.860174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.860256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.860192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/157e30fa-f8ef-4f33-aec5-c34208a28d25-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.860256 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.860217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.960923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.960890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.960923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.960926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961144 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwml5\" (UniqueName: \"kubernetes.io/projected/157e30fa-f8ef-4f33-aec5-c34208a28d25-kube-api-access-pwml5\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961144 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961250 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/157e30fa-f8ef-4f33-aec5-c34208a28d25-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961250 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961374 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961461 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.961639 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.961618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.963196 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.963139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/157e30fa-f8ef-4f33-aec5-c34208a28d25-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.963425 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.963406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/157e30fa-f8ef-4f33-aec5-c34208a28d25-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:14.968653 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:14.968630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwml5\" (UniqueName: \"kubernetes.io/projected/157e30fa-f8ef-4f33-aec5-c34208a28d25-kube-api-access-pwml5\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9\" (UID: \"157e30fa-f8ef-4f33-aec5-c34208a28d25\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:15.026973 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:15.026950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:15.166001 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:15.165973 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"]
Apr 22 19:20:15.166589 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:20:15.166541 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157e30fa_f8ef_4f33_aec5_c34208a28d25.slice/crio-ea1982f6f9bfc630079e203e560ed8133410e4ef648f59f49a47e777839d60ed WatchSource:0}: Error finding container ea1982f6f9bfc630079e203e560ed8133410e4ef648f59f49a47e777839d60ed: Status 404 returned error can't find the container with id ea1982f6f9bfc630079e203e560ed8133410e4ef648f59f49a47e777839d60ed
Apr 22 19:20:15.486612 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:15.486577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9" event={"ID":"157e30fa-f8ef-4f33-aec5-c34208a28d25","Type":"ContainerStarted","Data":"ea1982f6f9bfc630079e203e560ed8133410e4ef648f59f49a47e777839d60ed"}
Apr 22 19:20:18.484140 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.484109 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-cbb586f7c-bj2br"
Apr 22 19:20:18.533575 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.533507 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6d45f5d894-bvdbj"]
Apr 22 19:20:18.533860 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.533829 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6d45f5d894-bvdbj" podUID="d8bd0c42-c2ff-4b81-b28b-582ed5428220" containerName="maas-api" containerID="cri-o://08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09" gracePeriod=30
Apr 22 19:20:18.780676 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.780649 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6d45f5d894-bvdbj"
Apr 22 19:20:18.895129 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.895100 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rckbj\" (UniqueName: \"kubernetes.io/projected/d8bd0c42-c2ff-4b81-b28b-582ed5428220-kube-api-access-rckbj\") pod \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") "
Apr 22 19:20:18.895310 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.895209 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls\") pod \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\" (UID: \"d8bd0c42-c2ff-4b81-b28b-582ed5428220\") "
Apr 22 19:20:18.897412 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.897365 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "d8bd0c42-c2ff-4b81-b28b-582ed5428220" (UID: "d8bd0c42-c2ff-4b81-b28b-582ed5428220"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:20:18.897526 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.897451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bd0c42-c2ff-4b81-b28b-582ed5428220-kube-api-access-rckbj" (OuterVolumeSpecName: "kube-api-access-rckbj") pod "d8bd0c42-c2ff-4b81-b28b-582ed5428220" (UID: "d8bd0c42-c2ff-4b81-b28b-582ed5428220"). InnerVolumeSpecName "kube-api-access-rckbj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:20:18.996225 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.996200 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d8bd0c42-c2ff-4b81-b28b-582ed5428220-maas-api-tls\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:20:18.996225 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:18.996223 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rckbj\" (UniqueName: \"kubernetes.io/projected/d8bd0c42-c2ff-4b81-b28b-582ed5428220-kube-api-access-rckbj\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:20:19.511481 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.511431 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8bd0c42-c2ff-4b81-b28b-582ed5428220" containerID="08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09" exitCode=0
Apr 22 19:20:19.511942 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.511512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6d45f5d894-bvdbj" event={"ID":"d8bd0c42-c2ff-4b81-b28b-582ed5428220","Type":"ContainerDied","Data":"08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09"}
Apr 22 19:20:19.511942 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.511530 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6d45f5d894-bvdbj"
Apr 22 19:20:19.511942 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.511578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6d45f5d894-bvdbj" event={"ID":"d8bd0c42-c2ff-4b81-b28b-582ed5428220","Type":"ContainerDied","Data":"6bca03cceae1a5ebeb179f1a7de5dda9ca79e56c592311e2b11030ae44d1009b"}
Apr 22 19:20:19.511942 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.511603 2576 scope.go:117] "RemoveContainer" containerID="08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09"
Apr 22 19:20:19.522727 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.522703 2576 scope.go:117] "RemoveContainer" containerID="08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09"
Apr 22 19:20:19.523013 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:20:19.522987 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09\": container with ID starting with 08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09 not found: ID does not exist" containerID="08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09"
Apr 22 19:20:19.523113 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.523020 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09"} err="failed to get container status \"08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09\": rpc error: code = NotFound desc = could not find container \"08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09\": container with ID starting with 08ddd2c422a974d0e80ef16c1c64f950f497c1fe42731a636b8f843aebd8ca09 not found: ID does not exist"
Apr 22 19:20:19.539307 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.539278 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6d45f5d894-bvdbj"]
Apr 22 19:20:19.543672 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.543647 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6d45f5d894-bvdbj"]
Apr 22 19:20:19.884367 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:19.884285 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bd0c42-c2ff-4b81-b28b-582ed5428220" path="/var/lib/kubelet/pods/d8bd0c42-c2ff-4b81-b28b-582ed5428220/volumes"
Apr 22 19:20:23.531137 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:23.531098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9" event={"ID":"157e30fa-f8ef-4f33-aec5-c34208a28d25","Type":"ContainerStarted","Data":"ae3b6ff845522c6a47645504fb2e1f7ab2343b9ad786d76f21c456455f2463b3"}
Apr 22 19:20:26.253303 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.253262 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"]
Apr 22 19:20:26.253789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.253622 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8bd0c42-c2ff-4b81-b28b-582ed5428220" containerName="maas-api"
Apr 22 19:20:26.253789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.253634 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bd0c42-c2ff-4b81-b28b-582ed5428220" containerName="maas-api"
Apr 22 19:20:26.253789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.253704 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8bd0c42-c2ff-4b81-b28b-582ed5428220" containerName="maas-api"
Apr 22 19:20:26.257031 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.257006 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.260040 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.260015 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 22 19:20:26.277531 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.277507 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"]
Apr 22 19:20:26.360341 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.360308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.360341 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.360342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.360530 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.360379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c134dff1-bc78-42ea-b047-1d549a82f049-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.360530 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.360405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.360530 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.360447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpgnt\" (UniqueName: \"kubernetes.io/projected/c134dff1-bc78-42ea-b047-1d549a82f049-kube-api-access-wpgnt\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.360530 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.360468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.461706 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.461670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpgnt\" (UniqueName: \"kubernetes.io/projected/c134dff1-bc78-42ea-b047-1d549a82f049-kube-api-access-wpgnt\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.461865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.461718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.461865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.461784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.461865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.461805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.461865 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.461850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c134dff1-bc78-42ea-b047-1d549a82f049-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.462042 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.461870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.462188 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.462167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.462262 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.462239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.462324 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.462299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.464147 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.464121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c134dff1-bc78-42ea-b047-1d549a82f049-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.464405 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.464388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c134dff1-bc78-42ea-b047-1d549a82f049-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.471130 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.471103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpgnt\" (UniqueName: \"kubernetes.io/projected/c134dff1-bc78-42ea-b047-1d549a82f049-kube-api-access-wpgnt\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl\" (UID: \"c134dff1-bc78-42ea-b047-1d549a82f049\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.567901 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.567834 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"
Apr 22 19:20:26.709909 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:26.709886 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl"]
Apr 22 19:20:26.711596 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:20:26.711563 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc134dff1_bc78_42ea_b047_1d549a82f049.slice/crio-56908ab4fcb097067be8f7ea4b2cc52bce681a6b5e15d2baa53b5c2628ca7ca3 WatchSource:0}: Error finding container 56908ab4fcb097067be8f7ea4b2cc52bce681a6b5e15d2baa53b5c2628ca7ca3: Status 404 returned error can't find the container with id 56908ab4fcb097067be8f7ea4b2cc52bce681a6b5e15d2baa53b5c2628ca7ca3
Apr 22 19:20:27.549125 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:27.549093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" event={"ID":"c134dff1-bc78-42ea-b047-1d549a82f049","Type":"ContainerStarted","Data":"34beb94bd944f7ff95b95656a8829f8c5c621884329c35afca2ce883000df908"}
Apr 22 19:20:27.549125 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:27.549128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" event={"ID":"c134dff1-bc78-42ea-b047-1d549a82f049","Type":"ContainerStarted","Data":"56908ab4fcb097067be8f7ea4b2cc52bce681a6b5e15d2baa53b5c2628ca7ca3"}
Apr 22 19:20:28.553846 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:28.553815 2576 generic.go:358] "Generic (PLEG): container finished" podID="157e30fa-f8ef-4f33-aec5-c34208a28d25" containerID="ae3b6ff845522c6a47645504fb2e1f7ab2343b9ad786d76f21c456455f2463b3" exitCode=0
Apr 22 19:20:28.554200 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:28.553895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9" event={"ID":"157e30fa-f8ef-4f33-aec5-c34208a28d25","Type":"ContainerDied","Data":"ae3b6ff845522c6a47645504fb2e1f7ab2343b9ad786d76f21c456455f2463b3"}
Apr 22 19:20:30.566536 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:30.566496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9" event={"ID":"157e30fa-f8ef-4f33-aec5-c34208a28d25","Type":"ContainerStarted","Data":"edf99cf10cf3828629c4c72f1c5522d4d5023632182fac6b5ace97bf552fa94a"}
Apr 22 19:20:30.566982 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:30.566839 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9"
Apr 22 19:20:30.590377 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:30.590333 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9" podStartSLOduration=2.000125924 podStartE2EDuration="16.590318267s" podCreationTimestamp="2026-04-22 19:20:14 +0000 UTC" firstStartedPulling="2026-04-22 19:20:15.168279243 +0000 UTC m=+839.848757079" lastFinishedPulling="2026-04-22 19:20:29.758471574 +0000 UTC m=+854.438949422" observedRunningTime="2026-04-22 19:20:30.587244822 +0000 UTC m=+855.267722677" watchObservedRunningTime="2026-04-22 19:20:30.590318267 +0000 UTC
m=+855.270796123" Apr 22 19:20:32.575612 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:32.575583 2576 generic.go:358] "Generic (PLEG): container finished" podID="c134dff1-bc78-42ea-b047-1d549a82f049" containerID="34beb94bd944f7ff95b95656a8829f8c5c621884329c35afca2ce883000df908" exitCode=0 Apr 22 19:20:32.575987 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:32.575657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" event={"ID":"c134dff1-bc78-42ea-b047-1d549a82f049","Type":"ContainerDied","Data":"34beb94bd944f7ff95b95656a8829f8c5c621884329c35afca2ce883000df908"} Apr 22 19:20:33.004773 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.004742 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp"] Apr 22 19:20:33.007474 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.007457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.010532 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.010511 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 22 19:20:33.020822 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.020803 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp"] Apr 22 19:20:33.118074 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.118006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.118074 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.118058 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c4f42f-35aa-4221-baec-ef1742d731ab-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.118242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.118098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.118242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.118120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.118242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.118140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.118242 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.118193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgzf6\" (UniqueName: 
\"kubernetes.io/projected/a9c4f42f-35aa-4221-baec-ef1742d731ab-kube-api-access-vgzf6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219436 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219605 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c4f42f-35aa-4221-baec-ef1742d731ab-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219605 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219605 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219605 
ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219834 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgzf6\" (UniqueName: \"kubernetes.io/projected/a9c4f42f-35aa-4221-baec-ef1742d731ab-kube-api-access-vgzf6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.219939 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.220054 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.219959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.220054 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.220009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.221815 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.221792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9c4f42f-35aa-4221-baec-ef1742d731ab-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.222090 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.222069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9c4f42f-35aa-4221-baec-ef1742d731ab-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.228861 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.228839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgzf6\" (UniqueName: \"kubernetes.io/projected/a9c4f42f-35aa-4221-baec-ef1742d731ab-kube-api-access-vgzf6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp\" (UID: \"a9c4f42f-35aa-4221-baec-ef1742d731ab\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.317172 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.317137 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:33.447352 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.447325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp"] Apr 22 19:20:33.448959 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:20:33.448932 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c4f42f_35aa_4221_baec_ef1742d731ab.slice/crio-c0ff6fdec527aecaa200c758be95536b0f94b14535244a0848a4ffc3438fb130 WatchSource:0}: Error finding container c0ff6fdec527aecaa200c758be95536b0f94b14535244a0848a4ffc3438fb130: Status 404 returned error can't find the container with id c0ff6fdec527aecaa200c758be95536b0f94b14535244a0848a4ffc3438fb130 Apr 22 19:20:33.587200 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.587157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" event={"ID":"c134dff1-bc78-42ea-b047-1d549a82f049","Type":"ContainerStarted","Data":"2e55d1c2d0c61aeadb13ec3eccfa309fde42707502d2e5406e5b832b3ac6417c"} Apr 22 19:20:33.587617 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.587418 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" Apr 22 19:20:33.588648 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.588620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" event={"ID":"a9c4f42f-35aa-4221-baec-ef1742d731ab","Type":"ContainerStarted","Data":"afc2ef72d582d68dbd4606b262a144dd03dca27dd4a24131636e48536c7d7a09"} Apr 22 19:20:33.588648 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.588648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" 
event={"ID":"a9c4f42f-35aa-4221-baec-ef1742d731ab","Type":"ContainerStarted","Data":"c0ff6fdec527aecaa200c758be95536b0f94b14535244a0848a4ffc3438fb130"} Apr 22 19:20:33.611729 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:33.611688 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" podStartSLOduration=7.392534594 podStartE2EDuration="7.611675308s" podCreationTimestamp="2026-04-22 19:20:26 +0000 UTC" firstStartedPulling="2026-04-22 19:20:32.576232636 +0000 UTC m=+857.256710473" lastFinishedPulling="2026-04-22 19:20:32.795373354 +0000 UTC m=+857.475851187" observedRunningTime="2026-04-22 19:20:33.60798003 +0000 UTC m=+858.288457884" watchObservedRunningTime="2026-04-22 19:20:33.611675308 +0000 UTC m=+858.292153213" Apr 22 19:20:38.305970 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.305938 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp"] Apr 22 19:20:38.309666 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.309645 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.312152 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.312130 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 22 19:20:38.321507 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.321478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp"] Apr 22 19:20:38.465272 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.465232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.465272 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.465275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phflt\" (UniqueName: \"kubernetes.io/projected/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-kube-api-access-phflt\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.465453 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.465304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.466488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.465871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.466488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.465972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.466488 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.466027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.566871 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.566772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.566871 ip-10-0-141-191 
kubenswrapper[2576]: I0422 19:20:38.566831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.566871 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.566855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phflt\" (UniqueName: \"kubernetes.io/projected/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-kube-api-access-phflt\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.567146 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.566897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.567146 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.566944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.567146 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.567027 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.567428 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.567402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.567520 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.567436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.567611 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.567584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.569801 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.569778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.570041 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.570026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.575113 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.575093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phflt\" (UniqueName: \"kubernetes.io/projected/44ffadf8-ffea-410b-a26f-ed4efc17f6ff-kube-api-access-phflt\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp\" (UID: \"44ffadf8-ffea-410b-a26f-ed4efc17f6ff\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.620561 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.620521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" Apr 22 19:20:38.753087 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:38.753061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp"] Apr 22 19:20:38.755396 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:20:38.755363 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ffadf8_ffea_410b_a26f_ed4efc17f6ff.slice/crio-e45cd3708073658e797a4ebd71386f58e7fbb895e0e463d64b00b59a493251e6 WatchSource:0}: Error finding container e45cd3708073658e797a4ebd71386f58e7fbb895e0e463d64b00b59a493251e6: Status 404 returned error can't find the container with id e45cd3708073658e797a4ebd71386f58e7fbb895e0e463d64b00b59a493251e6 Apr 22 19:20:39.611823 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:39.611789 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9c4f42f-35aa-4221-baec-ef1742d731ab" containerID="afc2ef72d582d68dbd4606b262a144dd03dca27dd4a24131636e48536c7d7a09" exitCode=0 Apr 22 19:20:39.612227 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:39.611865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" event={"ID":"a9c4f42f-35aa-4221-baec-ef1742d731ab","Type":"ContainerDied","Data":"afc2ef72d582d68dbd4606b262a144dd03dca27dd4a24131636e48536c7d7a09"} Apr 22 19:20:39.613531 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:39.613503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" event={"ID":"44ffadf8-ffea-410b-a26f-ed4efc17f6ff","Type":"ContainerStarted","Data":"6a2d430e3aec25d87864defb6a9b04d12101034869367ef521b217ff159c7169"} Apr 22 19:20:39.613531 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:39.613532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" event={"ID":"44ffadf8-ffea-410b-a26f-ed4efc17f6ff","Type":"ContainerStarted","Data":"e45cd3708073658e797a4ebd71386f58e7fbb895e0e463d64b00b59a493251e6"} Apr 22 19:20:40.619875 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:40.619839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" event={"ID":"a9c4f42f-35aa-4221-baec-ef1742d731ab","Type":"ContainerStarted","Data":"b84b30645e60535706afe7967cc76bcec480069454801d69d7eb826bda0ce36e"} Apr 22 19:20:40.620373 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:40.620183 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" Apr 22 19:20:40.638863 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:40.638815 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp" podStartSLOduration=8.46259463 podStartE2EDuration="8.638800501s" podCreationTimestamp="2026-04-22 19:20:32 +0000 UTC" firstStartedPulling="2026-04-22 19:20:39.61261384 +0000 UTC m=+864.293091679" lastFinishedPulling="2026-04-22 19:20:39.788819704 +0000 UTC m=+864.469297550" observedRunningTime="2026-04-22 19:20:40.638003113 +0000 UTC m=+865.318480990" watchObservedRunningTime="2026-04-22 19:20:40.638800501 +0000 UTC m=+865.319278357" Apr 22 19:20:41.585522 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:41.585492 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9" Apr 22 19:20:44.605512 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:44.605482 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl" Apr 22 19:20:44.641764 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:44.641736 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="44ffadf8-ffea-410b-a26f-ed4efc17f6ff" containerID="6a2d430e3aec25d87864defb6a9b04d12101034869367ef521b217ff159c7169" exitCode=0
Apr 22 19:20:44.642006 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:44.641813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" event={"ID":"44ffadf8-ffea-410b-a26f-ed4efc17f6ff","Type":"ContainerDied","Data":"6a2d430e3aec25d87864defb6a9b04d12101034869367ef521b217ff159c7169"}
Apr 22 19:20:45.646862 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:45.646828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" event={"ID":"44ffadf8-ffea-410b-a26f-ed4efc17f6ff","Type":"ContainerStarted","Data":"df5d47227d4e4e8439728c6d938b75778af24d2da4edd206df250a36fbed440b"}
Apr 22 19:20:45.647310 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:45.647058 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp"
Apr 22 19:20:45.665232 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:45.665187 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp" podStartSLOduration=7.440122605 podStartE2EDuration="7.66517484s" podCreationTimestamp="2026-04-22 19:20:38 +0000 UTC" firstStartedPulling="2026-04-22 19:20:44.642450509 +0000 UTC m=+869.322928343" lastFinishedPulling="2026-04-22 19:20:44.867502741 +0000 UTC m=+869.547980578" observedRunningTime="2026-04-22 19:20:45.664183196 +0000 UTC m=+870.344661051" watchObservedRunningTime="2026-04-22 19:20:45.66517484 +0000 UTC m=+870.345652694"
Apr 22 19:20:51.640615 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:51.640586 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp"
Apr 22 19:20:56.663380 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:20:56.663353 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp"
Apr 22 19:21:29.527923 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.527885 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-64ff4455c7-52nvv"]
Apr 22 19:21:29.532786 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.532763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.541577 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.541538 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-64ff4455c7-52nvv"]
Apr 22 19:21:29.704767 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.704732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/97af3643-0d0b-4987-9600-d9c7e626f691-tls-cert\") pod \"authorino-64ff4455c7-52nvv\" (UID: \"97af3643-0d0b-4987-9600-d9c7e626f691\") " pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.704936 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.704850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcrmn\" (UniqueName: \"kubernetes.io/projected/97af3643-0d0b-4987-9600-d9c7e626f691-kube-api-access-qcrmn\") pod \"authorino-64ff4455c7-52nvv\" (UID: \"97af3643-0d0b-4987-9600-d9c7e626f691\") " pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.806257 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.806170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcrmn\" (UniqueName: \"kubernetes.io/projected/97af3643-0d0b-4987-9600-d9c7e626f691-kube-api-access-qcrmn\") pod \"authorino-64ff4455c7-52nvv\" (UID: \"97af3643-0d0b-4987-9600-d9c7e626f691\") " pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.806257 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.806225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/97af3643-0d0b-4987-9600-d9c7e626f691-tls-cert\") pod \"authorino-64ff4455c7-52nvv\" (UID: \"97af3643-0d0b-4987-9600-d9c7e626f691\") " pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.808716 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.808691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/97af3643-0d0b-4987-9600-d9c7e626f691-tls-cert\") pod \"authorino-64ff4455c7-52nvv\" (UID: \"97af3643-0d0b-4987-9600-d9c7e626f691\") " pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.816079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.816057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcrmn\" (UniqueName: \"kubernetes.io/projected/97af3643-0d0b-4987-9600-d9c7e626f691-kube-api-access-qcrmn\") pod \"authorino-64ff4455c7-52nvv\" (UID: \"97af3643-0d0b-4987-9600-d9c7e626f691\") " pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:29.842697 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:29.842671 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64ff4455c7-52nvv"
Apr 22 19:21:30.171349 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:30.171319 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-64ff4455c7-52nvv"]
Apr 22 19:21:30.172746 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:21:30.172719 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97af3643_0d0b_4987_9600_d9c7e626f691.slice/crio-d421b68c59bafbac6ce06d16df41febf1e380a09f8c9c41807513c6734beee55 WatchSource:0}: Error finding container d421b68c59bafbac6ce06d16df41febf1e380a09f8c9c41807513c6734beee55: Status 404 returned error can't find the container with id d421b68c59bafbac6ce06d16df41febf1e380a09f8c9c41807513c6734beee55
Apr 22 19:21:30.820953 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:30.820909 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64ff4455c7-52nvv" event={"ID":"97af3643-0d0b-4987-9600-d9c7e626f691","Type":"ContainerStarted","Data":"488984f855905621e09885519159938366c3c2c83f7f8629e9c984601ef9ddd2"}
Apr 22 19:21:30.820953 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:30.820955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64ff4455c7-52nvv" event={"ID":"97af3643-0d0b-4987-9600-d9c7e626f691","Type":"ContainerStarted","Data":"d421b68c59bafbac6ce06d16df41febf1e380a09f8c9c41807513c6734beee55"}
Apr 22 19:21:30.837028 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:30.836971 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-64ff4455c7-52nvv" podStartSLOduration=1.320280326 podStartE2EDuration="1.836955416s" podCreationTimestamp="2026-04-22 19:21:29 +0000 UTC" firstStartedPulling="2026-04-22 19:21:30.174199391 +0000 UTC m=+914.854677224" lastFinishedPulling="2026-04-22 19:21:30.69087448 +0000 UTC m=+915.371352314" observedRunningTime="2026-04-22 19:21:30.835468891 +0000 UTC m=+915.515946746" watchObservedRunningTime="2026-04-22 19:21:30.836955416 +0000 UTC m=+915.517433270"
Apr 22 19:21:30.860453 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:30.860419 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5549cfc44-kq958"]
Apr 22 19:21:30.860698 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:30.860675 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5549cfc44-kq958" podUID="8f040d06-9c31-401f-83d2-90969b7f25d9" containerName="authorino" containerID="cri-o://15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68" gracePeriod=30
Apr 22 19:21:31.111525 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.111502 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:21:31.217006 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.216974 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f040d06-9c31-401f-83d2-90969b7f25d9-tls-cert\") pod \"8f040d06-9c31-401f-83d2-90969b7f25d9\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") "
Apr 22 19:21:31.217172 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.217078 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcg2n\" (UniqueName: \"kubernetes.io/projected/8f040d06-9c31-401f-83d2-90969b7f25d9-kube-api-access-rcg2n\") pod \"8f040d06-9c31-401f-83d2-90969b7f25d9\" (UID: \"8f040d06-9c31-401f-83d2-90969b7f25d9\") "
Apr 22 19:21:31.219018 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.218989 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f040d06-9c31-401f-83d2-90969b7f25d9-kube-api-access-rcg2n" (OuterVolumeSpecName: "kube-api-access-rcg2n") pod "8f040d06-9c31-401f-83d2-90969b7f25d9" (UID: "8f040d06-9c31-401f-83d2-90969b7f25d9"). InnerVolumeSpecName "kube-api-access-rcg2n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:21:31.229000 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.228974 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f040d06-9c31-401f-83d2-90969b7f25d9-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "8f040d06-9c31-401f-83d2-90969b7f25d9" (UID: "8f040d06-9c31-401f-83d2-90969b7f25d9"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:21:31.318778 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.318745 2576 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f040d06-9c31-401f-83d2-90969b7f25d9-tls-cert\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:21:31.318778 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.318772 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcg2n\" (UniqueName: \"kubernetes.io/projected/8f040d06-9c31-401f-83d2-90969b7f25d9-kube-api-access-rcg2n\") on node \"ip-10-0-141-191.ec2.internal\" DevicePath \"\""
Apr 22 19:21:31.826682 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.826645 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f040d06-9c31-401f-83d2-90969b7f25d9" containerID="15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68" exitCode=0
Apr 22 19:21:31.827079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.826701 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5549cfc44-kq958"
Apr 22 19:21:31.827079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.826732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5549cfc44-kq958" event={"ID":"8f040d06-9c31-401f-83d2-90969b7f25d9","Type":"ContainerDied","Data":"15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68"}
Apr 22 19:21:31.827079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.826771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5549cfc44-kq958" event={"ID":"8f040d06-9c31-401f-83d2-90969b7f25d9","Type":"ContainerDied","Data":"375d84c2e8a625703062d1f82a1e2e634effb2edc972e573f3c5dc78ec3b9bd7"}
Apr 22 19:21:31.827079 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.826788 2576 scope.go:117] "RemoveContainer" containerID="15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68"
Apr 22 19:21:31.837771 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.837740 2576 scope.go:117] "RemoveContainer" containerID="15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68"
Apr 22 19:21:31.838390 ip-10-0-141-191 kubenswrapper[2576]: E0422 19:21:31.838367 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68\": container with ID starting with 15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68 not found: ID does not exist" containerID="15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68"
Apr 22 19:21:31.838604 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.838535 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68"} err="failed to get container status \"15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68\": rpc error: code = NotFound desc = could not find container \"15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68\": container with ID starting with 15a469231b14a7e285fcd3f03afa3eee7ba4e3a3bc622a446c1bbf325e30ad68 not found: ID does not exist"
Apr 22 19:21:31.856372 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.856339 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5549cfc44-kq958"]
Apr 22 19:21:31.862316 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.862293 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5549cfc44-kq958"]
Apr 22 19:21:31.887790 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:21:31.887755 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f040d06-9c31-401f-83d2-90969b7f25d9" path="/var/lib/kubelet/pods/8f040d06-9c31-401f-83d2-90969b7f25d9/volumes"
Apr 22 19:23:42.963975 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:42.963939 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-64ff4455c7-52nvv_97af3643-0d0b-4987-9600-d9c7e626f691/authorino/0.log"
Apr 22 19:23:46.763235 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:46.763190 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-cbb586f7c-bj2br_910d0f8d-0439-4b24-9666-7335d35d534b/maas-api/0.log"
Apr 22 19:23:46.869436 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:46.869397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6dcdf69c54-tkw65_c258c08c-38d0-4961-a2f5-f6ff503f09af/manager/0.log"
Apr 22 19:23:47.230049 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:47.229961 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c9fd8c974-wqwc6_b9b5dba6-27d1-4353-968a-0049be88faea/manager/0.log"
Apr 22 19:23:48.187384 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.187359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w_dcea311f-afff-4b14-874c-c48c8cfda339/util/0.log"
Apr 22 19:23:48.193914 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.193895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w_dcea311f-afff-4b14-874c-c48c8cfda339/pull/0.log"
Apr 22 19:23:48.199691 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.199674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w_dcea311f-afff-4b14-874c-c48c8cfda339/extract/0.log"
Apr 22 19:23:48.303608 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.303577 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g_021e828f-b756-4c70-91a0-53be754f90be/util/0.log"
Apr 22 19:23:48.309900 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.309879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g_021e828f-b756-4c70-91a0-53be754f90be/pull/0.log"
Apr 22 19:23:48.315722 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.315701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g_021e828f-b756-4c70-91a0-53be754f90be/extract/0.log"
Apr 22 19:23:48.419046 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.419014 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl_36838a31-9166-433f-a56b-d65afe7fccc0/util/0.log"
Apr 22 19:23:48.424862 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.424842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl_36838a31-9166-433f-a56b-d65afe7fccc0/pull/0.log"
Apr 22 19:23:48.430705 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.430687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl_36838a31-9166-433f-a56b-d65afe7fccc0/extract/0.log"
Apr 22 19:23:48.535947 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.535918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c_367c1472-8cda-4c0c-9f69-b79028800176/util/0.log"
Apr 22 19:23:48.541971 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.541946 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c_367c1472-8cda-4c0c-9f69-b79028800176/pull/0.log"
Apr 22 19:23:48.547889 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.547865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c_367c1472-8cda-4c0c-9f69-b79028800176/extract/0.log"
Apr 22 19:23:48.653244 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.653213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-64ff4455c7-52nvv_97af3643-0d0b-4987-9600-d9c7e626f691/authorino/0.log"
Apr 22 19:23:48.865140 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:48.865063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-77t92_64b14dcf-2441-4902-b01e-c729cb4558b5/manager/0.log"
Apr 22 19:23:49.068343 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:49.068313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bwlnk_b8c23efc-a33b-4301-a2cd-a5e76299c388/registry-server/0.log"
Apr 22 19:23:49.182214 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:49.182145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-m5lrk_5c2d426d-3561-4560-8644-9417aff439ba/manager/0.log"
Apr 22 19:23:49.733886 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:49.733854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t_778ca226-228b-49c6-af36-d9c52e4ed5e0/istio-proxy/0.log"
Apr 22 19:23:50.157123 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.157091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-4hzds_4c504779-1d51-402a-846b-cd36eb6c5927/istio-proxy/0.log"
Apr 22 19:23:50.574350 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.574321 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp_a9c4f42f-35aa-4221-baec-ef1742d731ab/storage-initializer/0.log"
Apr 22 19:23:50.580891 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.580871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-vk2fp_a9c4f42f-35aa-4221-baec-ef1742d731ab/main/0.log"
Apr 22 19:23:50.685653 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.685625 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl_c134dff1-bc78-42ea-b047-1d549a82f049/main/0.log"
Apr 22 19:23:50.692010 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.691990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-hfkrl_c134dff1-bc78-42ea-b047-1d549a82f049/storage-initializer/0.log"
Apr 22 19:23:50.903391 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.903314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp_44ffadf8-ffea-410b-a26f-ed4efc17f6ff/storage-initializer/0.log"
Apr 22 19:23:50.909291 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:50.909270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcck9gcp_44ffadf8-ffea-410b-a26f-ed4efc17f6ff/main/0.log"
Apr 22 19:23:51.012313 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:51.012278 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9_157e30fa-f8ef-4f33-aec5-c34208a28d25/storage-initializer/0.log"
Apr 22 19:23:51.018986 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:51.018968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-758s9_157e30fa-f8ef-4f33-aec5-c34208a28d25/main/0.log"
Apr 22 19:23:58.289709 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:58.289675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tksss_75bd4507-59f4-478b-8153-398fc3f4f109/global-pull-secret-syncer/0.log"
Apr 22 19:23:58.387888 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:58.387858 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z48vt_677eb918-a02c-47e7-853e-5d091e94e4e3/konnectivity-agent/0.log"
Apr 22 19:23:58.455842 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:23:58.455811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-191.ec2.internal_967a4594b2f65f314129e2fffff4e1e6/haproxy/0.log"
Apr 22 19:24:02.374371 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.374341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w_dcea311f-afff-4b14-874c-c48c8cfda339/extract/0.log"
Apr 22 19:24:02.398899 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.398873 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w_dcea311f-afff-4b14-874c-c48c8cfda339/util/0.log"
Apr 22 19:24:02.420595 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.420568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759c6n6w_dcea311f-afff-4b14-874c-c48c8cfda339/pull/0.log"
Apr 22 19:24:02.456015 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.455985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g_021e828f-b756-4c70-91a0-53be754f90be/extract/0.log"
Apr 22 19:24:02.476995 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.476968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g_021e828f-b756-4c70-91a0-53be754f90be/util/0.log"
Apr 22 19:24:02.498918 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.498891 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0ldw7g_021e828f-b756-4c70-91a0-53be754f90be/pull/0.log"
Apr 22 19:24:02.526308 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.526284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl_36838a31-9166-433f-a56b-d65afe7fccc0/extract/0.log"
Apr 22 19:24:02.551505 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.551478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl_36838a31-9166-433f-a56b-d65afe7fccc0/util/0.log"
Apr 22 19:24:02.576703 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.576676 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bzlfl_36838a31-9166-433f-a56b-d65afe7fccc0/pull/0.log"
Apr 22 19:24:02.605908 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.605883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c_367c1472-8cda-4c0c-9f69-b79028800176/extract/0.log"
Apr 22 19:24:02.628518 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.628448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c_367c1472-8cda-4c0c-9f69-b79028800176/util/0.log"
Apr 22 19:24:02.651082 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.651050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km64c_367c1472-8cda-4c0c-9f69-b79028800176/pull/0.log"
Apr 22 19:24:02.684423 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.684399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-64ff4455c7-52nvv_97af3643-0d0b-4987-9600-d9c7e626f691/authorino/0.log"
Apr 22 19:24:02.741315 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.741285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-77t92_64b14dcf-2441-4902-b01e-c729cb4558b5/manager/0.log"
Apr 22 19:24:02.800037 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.800008 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-bwlnk_b8c23efc-a33b-4301-a2cd-a5e76299c388/registry-server/0.log"
Apr 22 19:24:02.840766 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:02.840738 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-m5lrk_5c2d426d-3561-4560-8644-9417aff439ba/manager/0.log"
Apr 22 19:24:04.816452 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:04.816374 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cg86r_a9918c62-5567-45be-abb1-fef6111f9bf1/node-exporter/0.log"
Apr 22 19:24:04.842361 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:04.842335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cg86r_a9918c62-5567-45be-abb1-fef6111f9bf1/kube-rbac-proxy/0.log"
Apr 22 19:24:04.863638 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:04.863615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cg86r_a9918c62-5567-45be-abb1-fef6111f9bf1/init-textfile/0.log"
Apr 22 19:24:04.973894 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:04.973861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-52hs9_6334ddf3-d9ef-4f14-ab90-9f695eabbfb8/kube-rbac-proxy-main/0.log"
Apr 22 19:24:04.995934 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:04.995901 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-52hs9_6334ddf3-d9ef-4f14-ab90-9f695eabbfb8/kube-rbac-proxy-self/0.log"
Apr 22 19:24:05.017578 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:05.017543 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-52hs9_6334ddf3-d9ef-4f14-ab90-9f695eabbfb8/openshift-state-metrics/0.log"
Apr 22 19:24:05.275825 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:05.275798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-vmm9h_f2533978-99d2-4933-bde6-49394145a235/prometheus-operator-admission-webhook/0.log"
Apr 22 19:24:06.449966 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.449933 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"]
Apr 22 19:24:06.450324 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.450281 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f040d06-9c31-401f-83d2-90969b7f25d9" containerName="authorino"
Apr 22 19:24:06.450324 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.450292 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f040d06-9c31-401f-83d2-90969b7f25d9" containerName="authorino"
Apr 22 19:24:06.450398 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.450343 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f040d06-9c31-401f-83d2-90969b7f25d9" containerName="authorino"
Apr 22 19:24:06.453513 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.453496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.455701 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.455677 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tdl99\"/\"default-dockercfg-ncqxp\""
Apr 22 19:24:06.455915 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.455900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tdl99\"/\"openshift-service-ca.crt\""
Apr 22 19:24:06.456789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.456771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tdl99\"/\"kube-root-ca.crt\""
Apr 22 19:24:06.460626 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.460603 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"]
Apr 22 19:24:06.533847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.533815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-proc\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.533847 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.533860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-sys\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.534096 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.533962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-podres\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.534096 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.534001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdrwq\" (UniqueName: \"kubernetes.io/projected/84a45bea-3590-4176-b2a7-dd4fd4322a49-kube-api-access-jdrwq\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.534167 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.534109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-lib-modules\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.634950 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.634912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-lib-modules\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.634950 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.634955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-proc\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.634982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-sys\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.635018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-podres\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.635036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdrwq\" (UniqueName: \"kubernetes.io/projected/84a45bea-3590-4176-b2a7-dd4fd4322a49-kube-api-access-jdrwq\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.635102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-sys\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.635121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-proc\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.635102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-lib-modules\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.635194 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.635149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/84a45bea-3590-4176-b2a7-dd4fd4322a49-podres\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.644255 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.644230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdrwq\" (UniqueName: \"kubernetes.io/projected/84a45bea-3590-4176-b2a7-dd4fd4322a49-kube-api-access-jdrwq\") pod \"perf-node-gather-daemonset-kzqvr\" (UID: \"84a45bea-3590-4176-b2a7-dd4fd4322a49\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.765058 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.765017 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:06.893331 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:06.893301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"]
Apr 22 19:24:06.894645 ip-10-0-141-191 kubenswrapper[2576]: W0422 19:24:06.894615 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod84a45bea_3590_4176_b2a7_dd4fd4322a49.slice/crio-b0d792ea5c67914864f64ce0091a70bea9ca4f4efcf4f121759e92c5ed499725 WatchSource:0}: Error finding container b0d792ea5c67914864f64ce0091a70bea9ca4f4efcf4f121759e92c5ed499725: Status 404 returned error can't find the container with id b0d792ea5c67914864f64ce0091a70bea9ca4f4efcf4f121759e92c5ed499725
Apr 22 19:24:07.425785 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:07.425746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr" event={"ID":"84a45bea-3590-4176-b2a7-dd4fd4322a49","Type":"ContainerStarted","Data":"3356d6004444c6046d67c378a25f797a03f86c4018a2f4304012e1c1a5884305"}
Apr 22 19:24:07.425785 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:07.425792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr" event={"ID":"84a45bea-3590-4176-b2a7-dd4fd4322a49","Type":"ContainerStarted","Data":"b0d792ea5c67914864f64ce0091a70bea9ca4f4efcf4f121759e92c5ed499725"}
Apr 22 19:24:07.426154 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:07.425897 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr"
Apr 22 19:24:07.441318 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:07.441272 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr" podStartSLOduration=1.4412591510000001 podStartE2EDuration="1.441259151s" podCreationTimestamp="2026-04-22 19:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:07.439888312 +0000 UTC m=+1072.120366168" watchObservedRunningTime="2026-04-22 19:24:07.441259151 +0000 UTC m=+1072.121737054"
Apr 22 19:24:08.804636 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:08.804608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-467kd_f0879ed5-18cc-4265-8956-15d1b97cade2/dns/0.log"
Apr 22 19:24:08.825513 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:08.825487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-467kd_f0879ed5-18cc-4265-8956-15d1b97cade2/kube-rbac-proxy/0.log"
Apr 22 19:24:08.958031 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:08.958003 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9gplk_eb917968-4a52-4305-8b42-7cfc0d5bf83c/dns-node-resolver/0.log"
Apr 22 19:24:09.457197 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:09.457170 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sqb4x_e0f709c0-fc28-4eab-9cf8-603681f7f300/node-ca/0.log"
Apr 22 19:24:10.274102 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:10.274072 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfr8h4t_778ca226-228b-49c6-af36-d9c52e4ed5e0/istio-proxy/0.log"
Apr 22 19:24:10.397342 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:10.397308 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-4hzds_4c504779-1d51-402a-846b-cd36eb6c5927/istio-proxy/0.log"
Apr 22 19:24:10.962691 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:10.962665 2576 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-772dj_0c333e53-81f1-4b5a-91a4-6aad9cbe63aa/serve-healthcheck-canary/0.log" Apr 22 19:24:11.448476 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:11.448442 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8z9lw_0739b7b9-36a6-4c2f-aafb-af6c20c38569/kube-rbac-proxy/0.log" Apr 22 19:24:11.468333 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:11.468303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8z9lw_0739b7b9-36a6-4c2f-aafb-af6c20c38569/exporter/0.log" Apr 22 19:24:11.493224 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:11.493185 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8z9lw_0739b7b9-36a6-4c2f-aafb-af6c20c38569/extractor/0.log" Apr 22 19:24:13.438713 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:13.438685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-kzqvr" Apr 22 19:24:13.512391 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:13.512355 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-cbb586f7c-bj2br_910d0f8d-0439-4b24-9666-7335d35d534b/maas-api/0.log" Apr 22 19:24:13.539483 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:13.539451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6dcdf69c54-tkw65_c258c08c-38d0-4961-a2f5-f6ff503f09af/manager/0.log" Apr 22 19:24:13.629672 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:13.629629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c9fd8c974-wqwc6_b9b5dba6-27d1-4353-968a-0049be88faea/manager/0.log" Apr 22 19:24:14.797836 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:14.797803 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6769c56bf6-p5fkh_bc21ea98-0bde-4948-a397-97e39c96fd9a/manager/0.log" Apr 22 19:24:14.848179 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:14.848152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-7bx5r_db1ba053-7c0b-4491-92b5-535da3303c77/openshift-lws-operator/0.log" Apr 22 19:24:20.617657 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.617630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/kube-multus-additional-cni-plugins/0.log" Apr 22 19:24:20.637921 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.637894 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/egress-router-binary-copy/0.log" Apr 22 19:24:20.658866 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.658837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/cni-plugins/0.log" Apr 22 19:24:20.680789 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.680766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/bond-cni-plugin/0.log" Apr 22 19:24:20.700365 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.700344 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/routeoverride-cni/0.log" Apr 22 19:24:20.724041 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.724015 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/whereabouts-cni-bincopy/0.log" Apr 22 19:24:20.747945 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.747924 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sgjxl_0d1ae74c-067a-489d-a7eb-707cf1b181a7/whereabouts-cni/0.log" Apr 22 19:24:20.949460 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:20.949383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ktwxr_8b12af7b-06a9-4788-95b8-dc94a26738fe/kube-multus/0.log" Apr 22 19:24:21.061264 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:21.061235 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gk4zn_42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc/network-metrics-daemon/0.log" Apr 22 19:24:21.079040 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:21.079021 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gk4zn_42a195d9-ce8e-4dc5-9cb1-8e23ae31d6bc/kube-rbac-proxy/0.log" Apr 22 19:24:21.938686 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:21.938650 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/ovn-controller/0.log" Apr 22 19:24:21.970225 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:21.970196 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/ovn-acl-logging/0.log" Apr 22 19:24:21.995373 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:21.995342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/kube-rbac-proxy-node/0.log" Apr 22 19:24:22.028810 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:22.028781 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:24:22.050688 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:22.050659 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/northd/0.log" Apr 22 19:24:22.073349 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:22.073324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/nbdb/0.log" Apr 22 19:24:22.098006 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:22.097978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/sbdb/0.log" Apr 22 19:24:22.193066 ip-10-0-141-191 kubenswrapper[2576]: I0422 19:24:22.192979 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7f4rv_65c2e60e-a686-4be8-bb8d-33be235b8b32/ovnkube-controller/0.log"