Apr 21 04:21:09.133430 ip-10-0-140-163 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 04:21:09.133441 ip-10-0-140-163 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 04:21:09.133448 ip-10-0-140-163 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 04:21:09.133731 ip-10-0-140-163 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 04:21:19.158603 ip-10-0-140-163 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 04:21:19.158622 ip-10-0-140-163 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bb76cfcee49f44438ab5354fdb2f7db5 --
Apr 21 04:23:41.028004 ip-10-0-140-163 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 04:23:41.465560 ip-10-0-140-163 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:23:41.465560 ip-10-0-140-163 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 04:23:41.465560 ip-10-0-140-163 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:23:41.465560 ip-10-0-140-163 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 04:23:41.465560 ip-10-0-140-163 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:23:41.466421 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.466334 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 04:23:41.471411 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471389 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:23:41.471411 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471409 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:23:41.471411 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471415 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:23:41.471411 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471418 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471421 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471424 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471427 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471430 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471433 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471436 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471439 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471441 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471444 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471446 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471449 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471451 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471454 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471456 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471459 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471462 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471464 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471467 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471470 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:23:41.471578 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471472 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471478 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471481 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471484 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471488 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471491 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471493 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471496 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471500 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471503 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471506 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471509 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471512 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471514 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471517 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471521 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471524 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471526 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471529 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471531 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:23:41.472086 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471534 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471537 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471539 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471542 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471545 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471547 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471550 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471553 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471555 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471558 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471560 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471563 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471565 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471568 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471570 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471573 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471575 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471578 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471581 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471584 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:23:41.472608 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471587 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471590 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471592 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471595 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471597 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471600 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471604 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471606 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471609 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471612 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471614 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471617 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471619 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471623 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471626 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471630 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471642 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471645 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471648 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471650 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:23:41.473110 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471653 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471655 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.471658 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472068 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472073 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472076 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472079 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472082 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472084 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472088 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472092 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472096 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472098 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472101 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472104 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472106 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472109 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472112 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472115 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472118 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:23:41.473588 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472121 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472124 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472126 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472128 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472131 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472134 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472136 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472140 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472142 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472145 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472148 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472150 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472153 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472155 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472158 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472160 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472164 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472166 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472169 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472172 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:23:41.474081 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472174 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472177 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472179 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472182 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472184 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472187 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472189 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472192 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472196 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472200 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472203 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472206 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472209 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472212 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472214 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472217 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472220 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472222 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472225 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:23:41.474585 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472227 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472230 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472233 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472236 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472239 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472241 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472244 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472246 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472249 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472252 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472254 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472257 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472259 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472262 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472264 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472266 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472269 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472271 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472274 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472277 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:23:41.475075 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472279 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472281 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472284 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472287 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472290 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472292 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472295 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472297 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472300 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.472303 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472377 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472385 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472391 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472396 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472401 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472404 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472409 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472414 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472417 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472421 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472424 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 04:23:41.475596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472427 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472430 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472433 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472436 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472439 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472442 2574 flags.go:64] FLAG: --cloud-config=""
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472445 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472448 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472452 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472455 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472458 2574 flags.go:64] FLAG: --config-dir=""
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472461 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472464 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472468 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472472 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472475 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472478 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472481 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472484 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472487 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472490 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472493 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472497 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472501 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472504 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 04:23:41.476134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472507 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472510 2574 flags.go:64] FLAG: --enable-server="true"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472513 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472518 2574 flags.go:64] FLAG: --event-burst="100"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472521 2574 flags.go:64] FLAG: --event-qps="50"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472524 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472528 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472531 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472534 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472537 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472540 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472543 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472546 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472549 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472552 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472555 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472558 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472561 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472564 2574 flags.go:64] FLAG: --feature-gates="" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472568 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472571 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472574 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472577 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472580 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472583 2574 flags.go:64] FLAG: --help="false" Apr 21 04:23:41.476734 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472586 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472590 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472592 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472595 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472599 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472602 2574 flags.go:64] 
FLAG: --image-gc-high-threshold="85" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472605 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472608 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472611 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472614 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472617 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472621 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472624 2574 flags.go:64] FLAG: --kube-reserved="" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472627 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472630 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472633 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472635 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472638 2574 flags.go:64] FLAG: --lock-file="" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472641 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472644 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 04:23:41.477338 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:23:41.472647 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472652 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472655 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472658 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 04:23:41.477338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472661 2574 flags.go:64] FLAG: --logging-format="text" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472664 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472667 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472670 2574 flags.go:64] FLAG: --manifest-url="" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472677 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472686 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472689 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472693 2574 flags.go:64] FLAG: --max-pods="110" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472696 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472699 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472702 2574 flags.go:64] FLAG: 
--memory-manager-policy="None" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472705 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472709 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472712 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472715 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472736 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472740 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472743 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472747 2574 flags.go:64] FLAG: --pod-cidr="" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472750 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472755 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472758 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472761 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472764 2574 flags.go:64] FLAG: --port="10250" Apr 21 04:23:41.477938 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472767 2574 
flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472770 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00591c678d9d2ced6" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472773 2574 flags.go:64] FLAG: --qos-reserved="" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472776 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472779 2574 flags.go:64] FLAG: --register-node="true" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472782 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472785 2574 flags.go:64] FLAG: --register-with-taints="" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472789 2574 flags.go:64] FLAG: --registry-burst="10" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472791 2574 flags.go:64] FLAG: --registry-qps="5" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472794 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472797 2574 flags.go:64] FLAG: --reserved-memory="" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472801 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472804 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472808 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472811 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 
04:23:41.472814 2574 flags.go:64] FLAG: --runonce="false" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472817 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472820 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472823 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472826 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472830 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472833 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472836 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472839 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472842 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472845 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 04:23:41.478503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472848 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472851 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472854 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472857 2574 flags.go:64] FLAG: 
--system-cgroups="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472860 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472865 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472868 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472871 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472876 2574 flags.go:64] FLAG: --tls-min-version="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472879 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472881 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472884 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472887 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472891 2574 flags.go:64] FLAG: --v="2" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472895 2574 flags.go:64] FLAG: --version="false" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472899 2574 flags.go:64] FLAG: --vmodule="" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472903 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.472906 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 
04:23:41.472997 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473002 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473005 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473008 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473011 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473014 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:41.479141 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473017 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473020 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473024 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473027 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473030 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473033 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473035 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:41.479735 ip-10-0-140-163 
kubenswrapper[2574]: W0421 04:23:41.473038 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473041 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473045 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473049 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473052 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473055 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473059 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473062 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473065 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473068 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473071 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473074 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:23:41.479735 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473077 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473079 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473082 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473085 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473087 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473090 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473092 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473095 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473099 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473101 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473104 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473106 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473109 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473112 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473114 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473121 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473124 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473126 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473129 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473131 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:23:41.480247 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473134 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473137 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473139 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473142 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473145 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473148 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473151 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473153 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473156 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473158 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473161 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473163 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473166 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473168 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473171 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473174 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473176 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473178 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473181 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:23:41.480733 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473184 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473187 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473190 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473193 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473195 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473198 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473200 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473203 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473207 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473209 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473212 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473214 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473217 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473219 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473222 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473225 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473227 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473230 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473233 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473235 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:23:41.481200 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473238 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.473241 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.474163 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.481142 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.481159 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481221 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481227 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481230 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481233 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481236 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481239 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481241 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481244 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481247 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481250 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481252 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:23:41.481686 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481255 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481257 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481260 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481263 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481265 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481268
2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481270 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481273 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481276 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481278 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481280 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481283 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481286 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481288 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481292 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481294 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481297 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481299 2574 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481302 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481304 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:41.482116 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481307 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481310 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481312 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481315 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481318 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481320 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481323 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481325 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481328 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481331 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 
04:23:41.481333 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481335 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481338 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481340 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481343 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481345 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481348 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481351 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481353 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481356 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:23:41.482616 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481358 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481361 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481363 2574 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481366 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481368 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481371 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481374 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481377 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481379 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481382 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481384 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481387 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481390 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481393 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481396 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481399 2574 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481402 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481406 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481410 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 04:23:41.483161 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481414 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481417 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481419 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481422 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481425 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481428 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481431 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481433 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481436 2574 feature_gate.go:328] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481439 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481441 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481444 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481446 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481449 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481451 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481454 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:41.483621 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.481459 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481559 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481564 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481567 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481570 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481574 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481577 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481579 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481582 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481586 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481589 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481592 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481595 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481597 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481600 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481602 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481605 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481608 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481610 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:23:41.484049 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481613 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481615 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481618 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481620 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481623 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481625 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481628 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481630 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481633 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481636 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481638 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481640 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481643 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481645 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481648 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481650 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481653 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481655 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481658 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481662 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:23:41.484561 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481664 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481667 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481670 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481673 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481675 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481678 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481680 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481683 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481686 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481688 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481691 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481693 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481696 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481698 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481701 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481703 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481706 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481708 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481711 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481713 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:23:41.485068 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481716 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481718 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481735 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481738 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481741 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481743 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481746 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481748 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481752 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481755 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481758 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481761 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481764 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481767 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481770 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481772 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481775 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481778 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481780 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:23:41.485565 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481782 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481785 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481788 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481790 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481792 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481795 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481797 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481800 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:41.481802 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.481807 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.481903 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.483824 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.485174 2574 server.go:1019] "Starting client certificate rotation"
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.485266 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:23:41.486112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.485305 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:23:41.511010 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.510992 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:23:41.516608 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.516582 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:23:41.537243 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.537225 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 21 04:23:41.542707 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.542693 2574 log.go:25] "Validated CRI v1 image API"
Apr 21 04:23:41.543920 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.543904 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 04:23:41.548847 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.548824 2574 fs.go:135] Filesystem UUIDs: map[4c8ad389-e76a-4d02-9862-a42e7d45efdd:/dev/nvme0n1p4 6cef1655-4d95-4da6-b861-5ed64594eaef:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 21 04:23:41.548923 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.548845 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 04:23:41.553889 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.553770 2574 manager.go:217] Machine: {Timestamp:2026-04-21 04:23:41.552624343 +0000 UTC m=+0.412728313 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100312 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec286607101333c5cf15bc84058af0b2 SystemUUID:ec286607-1013-33c5-cf15-bc84058af0b2 BootID:bb76cfce-e49f-4443-8ab5-354fdb2f7db5 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:44:f7:e3:d1:1f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:44:f7:e3:d1:1f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:03:e4:7f:99:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 04:23:41.553889 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.553877 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 04:23:41.554023 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.553989 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 04:23:41.555493 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.555476 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 04:23:41.555927 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.555905 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 04:23:41.556086 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.555930 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-163.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Per
centage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 04:23:41.556136 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.556095 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 04:23:41.556136 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.556103 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 04:23:41.556136 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.556117 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:23:41.556136 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.556133 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:23:41.556895 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.556885 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:23:41.557001 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.556992 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 04:23:41.559198 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.559188 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 21 04:23:41.559238 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.559202 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 04:23:41.559991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.559982 2574 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 21 04:23:41.560032 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.559996 2574 kubelet.go:397] "Adding apiserver pod source" Apr 21 04:23:41.560032 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.560006 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 04:23:41.560976 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.560966 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:23:41.561024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.560983 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:23:41.564198 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.564179 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 04:23:41.565463 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.565450 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 04:23:41.567094 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567079 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 04:23:41.567094 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567098 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567104 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567111 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567122 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 
04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567132 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567138 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567144 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567151 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567157 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567174 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 04:23:41.567237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.567184 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 04:23:41.568946 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.568934 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 04:23:41.568946 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.568945 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 04:23:41.572487 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.572473 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 04:23:41.572552 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.572518 2574 server.go:1295] "Started kubelet" Apr 21 04:23:41.573421 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.573290 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 04:23:41.573562 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.573496 2574 csi_plugin.go:988] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-163.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 04:23:41.573755 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.573682 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-163.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 04:23:41.573877 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.573860 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 04:23:41.573935 ip-10-0-140-163 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 04:23:41.574096 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.574058 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 04:23:41.574157 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.574121 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 04:23:41.575503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.575482 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 04:23:41.576337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.576320 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 21 04:23:41.581025 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.581008 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 04:23:41.581116 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.581028 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 04:23:41.583596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583342 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 04:23:41.583596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583546 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 04:23:41.583596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583474 2574 factory.go:55] Registering systemd factory Apr 21 04:23:41.583792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583606 2574 factory.go:223] Registration of the systemd container factory successfully Apr 21 04:23:41.583792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583368 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 04:23:41.583792 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.583675 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found" Apr 
21 04:23:41.583792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583748 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 21 04:23:41.583792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583757 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 21 04:23:41.584003 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583836 2574 factory.go:153] Registering CRI-O factory Apr 21 04:23:41.584003 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583852 2574 factory.go:223] Registration of the crio container factory successfully Apr 21 04:23:41.584003 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583945 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 04:23:41.584003 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.583988 2574 factory.go:103] Registering Raw factory Apr 21 04:23:41.584003 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.584003 2574 manager.go:1196] Started watching for new ooms in manager Apr 21 04:23:41.584357 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.583033 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-163.ec2.internal.18a8448c1d213265 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-163.ec2.internal,UID:ip-10-0-140-163.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-163.ec2.internal,},FirstTimestamp:2026-04-21 04:23:41.572485733 +0000 UTC m=+0.432589709,LastTimestamp:2026-04-21 04:23:41.572485733 +0000 UTC 
m=+0.432589709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-163.ec2.internal,}" Apr 21 04:23:41.584532 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.584396 2574 manager.go:319] Starting recovery of all containers Apr 21 04:23:41.587710 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.587666 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 04:23:41.593464 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.593442 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-163.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 04:23:41.593464 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.593443 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 04:23:41.593762 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.593664 2574 manager.go:324] Recovery completed Apr 21 04:23:41.595408 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.595389 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j562f" Apr 21 04:23:41.598391 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.598375 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:41.600712 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.600698 2574 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:41.600791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.600744 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:41.600791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.600758 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:41.601192 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.601179 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 04:23:41.601192 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.601190 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 04:23:41.601289 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.601213 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:23:41.602166 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.602153 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j562f" Apr 21 04:23:41.603295 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.603281 2574 policy_none.go:49] "None policy: Start" Apr 21 04:23:41.603295 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.603297 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 04:23:41.603392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.603306 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641288 2574 manager.go:341] "Starting Device Plugin manager" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.641321 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 04:23:41.645221 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641331 2574 server.go:85] "Starting device plugin registration server" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641571 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641581 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641700 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641828 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.641839 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.642308 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 04:23:41.645221 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.642345 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-163.ec2.internal\" not found" Apr 21 04:23:41.681276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.681245 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 04:23:41.682411 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.682397 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 21 04:23:41.682471 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.682424 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 04:23:41.682471 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.682444 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 04:23:41.682471 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.682450 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 04:23:41.682573 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.682486 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 04:23:41.684984 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.684959 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:41.742014 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.741929 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:41.742897 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.742880 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:41.742971 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.742916 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:41.742971 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.742928 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:41.742971 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.742951 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.753542 ip-10-0-140-163 kubenswrapper[2574]: I0421 
04:23:41.753522 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.753614 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.753544 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-163.ec2.internal\": node \"ip-10-0-140-163.ec2.internal\" not found" Apr 21 04:23:41.769017 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.768998 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found" Apr 21 04:23:41.783568 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.783551 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal"] Apr 21 04:23:41.783643 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.783612 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:41.784514 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.784498 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:41.784570 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.784527 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:41.784570 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.784537 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:41.785754 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.785743 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:41.785876 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.785858 2574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.785913 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.785893 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:41.786422 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.786405 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:41.786422 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.786417 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:41.786518 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.786437 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:41.786518 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.786448 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:41.786518 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.786437 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:41.786605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.786525 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:41.787700 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.787686 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.787760 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.787714 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:41.788308 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.788295 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:41.788374 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.788322 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:41.788374 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.788338 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:41.824392 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.824370 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-163.ec2.internal\" not found" node="ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.828909 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.828893 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-163.ec2.internal\" not found" node="ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.869386 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.869357 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found" Apr 21 04:23:41.884551 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.884524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79e8d65e08ec631d5f830fe768b488cf-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal\" (UID: \"79e8d65e08ec631d5f830fe768b488cf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.884618 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.884564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79e8d65e08ec631d5f830fe768b488cf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal\" (UID: \"79e8d65e08ec631d5f830fe768b488cf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.969495 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:41.969454 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found" Apr 21 04:23:41.984836 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.984811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79e8d65e08ec631d5f830fe768b488cf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal\" (UID: \"79e8d65e08ec631d5f830fe768b488cf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.984917 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.984842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79e8d65e08ec631d5f830fe768b488cf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal\" (UID: \"79e8d65e08ec631d5f830fe768b488cf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.984917 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.984869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/host-path/3f4b2ba79c5ad3c70dc6c2f14b4ac240-config\") pod \"kube-apiserver-proxy-ip-10-0-140-163.ec2.internal\" (UID: \"3f4b2ba79c5ad3c70dc6c2f14b4ac240\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.984917 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.984911 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79e8d65e08ec631d5f830fe768b488cf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal\" (UID: \"79e8d65e08ec631d5f830fe768b488cf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:41.985012 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:41.984929 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79e8d65e08ec631d5f830fe768b488cf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal\" (UID: \"79e8d65e08ec631d5f830fe768b488cf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" Apr 21 04:23:42.070265 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.070198 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found" Apr 21 04:23:42.085503 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.085479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3f4b2ba79c5ad3c70dc6c2f14b4ac240-config\") pod \"kube-apiserver-proxy-ip-10-0-140-163.ec2.internal\" (UID: \"3f4b2ba79c5ad3c70dc6c2f14b4ac240\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal" Apr 21 04:23:42.085556 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.085528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/3f4b2ba79c5ad3c70dc6c2f14b4ac240-config\") pod \"kube-apiserver-proxy-ip-10-0-140-163.ec2.internal\" (UID: \"3f4b2ba79c5ad3c70dc6c2f14b4ac240\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal"
Apr 21 04:23:42.126649 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.126622 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal"
Apr 21 04:23:42.132021 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.132004 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal"
Apr 21 04:23:42.171265 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.171235 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found"
Apr 21 04:23:42.271819 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.271774 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found"
Apr 21 04:23:42.372326 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.372236 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found"
Apr 21 04:23:42.472796 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.472766 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found"
Apr 21 04:23:42.485030 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.485008 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 04:23:42.485153 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.485144 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 04:23:42.573762 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.573737 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found"
Apr 21 04:23:42.583576 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.583557 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 04:23:42.594236 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.594217 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:23:42.603883 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.603858 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 04:18:41 +0000 UTC" deadline="2028-01-06 07:11:29.544734348 +0000 UTC"
Apr 21 04:23:42.603883 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.603880 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15002h47m46.940857012s"
Apr 21 04:23:42.614609 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.614591 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t98qv"
Apr 21 04:23:42.622288 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.622273 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t98qv"
Apr 21 04:23:42.674716 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:42.674653 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-163.ec2.internal\" not found"
Apr 21 04:23:42.696432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.696412 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:23:42.752987 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.752960 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:23:42.783648 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.783623 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal"
Apr 21 04:23:42.784095 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:42.784061 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e8d65e08ec631d5f830fe768b488cf.slice/crio-c4472dc6941f000bf31297dbf880b201839a1b1003375a237f1f5b96b8be9fb7 WatchSource:0}: Error finding container c4472dc6941f000bf31297dbf880b201839a1b1003375a237f1f5b96b8be9fb7: Status 404 returned error can't find the container with id c4472dc6941f000bf31297dbf880b201839a1b1003375a237f1f5b96b8be9fb7
Apr 21 04:23:42.784447 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:42.784433 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4b2ba79c5ad3c70dc6c2f14b4ac240.slice/crio-5240d5ab38b68673b15af46f86f11f875f253e3700dc023d180f7d9ddb5d8375 WatchSource:0}: Error finding container 5240d5ab38b68673b15af46f86f11f875f253e3700dc023d180f7d9ddb5d8375: Status 404 returned error can't find the container with id 5240d5ab38b68673b15af46f86f11f875f253e3700dc023d180f7d9ddb5d8375
Apr 21 04:23:42.790700 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.790159 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:23:42.795188 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.795164 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:23:42.795957 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.795943 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal"
Apr 21 04:23:42.800591 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.800576 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:23:42.802889 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:42.802872 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 04:23:43.560889 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.560851 2574 apiserver.go:52] "Watching apiserver"
Apr 21 04:23:43.569086 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.569059 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 04:23:43.570012 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.569579 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-77fkm","openshift-multus/multus-additional-cni-plugins-85nl4","openshift-network-diagnostics/network-check-target-wxgvr","openshift-network-operator/iptables-alerter-zc2r8","kube-system/konnectivity-agent-4zx8l","kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal","openshift-multus/network-metrics-daemon-bwjm7","openshift-ovn-kubernetes/ovnkube-node-szzqq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp","openshift-cluster-node-tuning-operator/tuned-xlmcp","openshift-dns/node-resolver-7b454","openshift-image-registry/node-ca-pqxgt"]
Apr 21 04:23:43.571211 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.571183 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.573805 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.573783 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.573909 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.573885 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:43.573988 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.573970 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:23:43.574048 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.574006 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 04:23:43.574545 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.574526 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.574638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.574591 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8jrn8\""
Apr 21 04:23:43.576945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.575510 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.576945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.576491 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.578862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.577951 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.578862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.578439 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 04:23:43.578862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.578490 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 04:23:43.578862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.578701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 04:23:43.578862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.578711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 04:23:43.578862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.578744 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.579203 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.578910 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.579203 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.579090 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.579203 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.579103 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-27vpl\""
Apr 21 04:23:43.579347 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.579293 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vfg2r\""
Apr 21 04:23:43.582343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.579957 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 04:23:43.582343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.580174 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.582343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.580204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dqs72\""
Apr 21 04:23:43.582343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.580517 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 04:23:43.582343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.581414 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.583282 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.583262 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.586382 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.585032 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:43.586382 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.585107 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:23:43.586382 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.585603 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.587987 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.588658 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.591670 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.591856 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592006 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592063 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592130 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7mcsw\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592219 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592791 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x5h7q\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-run-ovn-kubernetes\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592924 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h5tb7\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-k8s-cni-cncf-io\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.592996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-multus-certs\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593032 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nl5wz\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1042efe6-4e47-4044-89ad-3285ece081ef-konnectivity-ca\") pod \"konnectivity-agent-4zx8l\" (UID: \"1042efe6-4e47-4044-89ad-3285ece081ef\") " pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593065 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593080 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-device-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-etc-selinux\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-var-lib-kubelet\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-kubelet\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-slash\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-run-netns\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.595276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-log-socket\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-cni-binary-copy\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593929 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593961 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.593955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-systemd\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-netns\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594097 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-cni-bin\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-cni-multus\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594178 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594302 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-etc-kubernetes\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-systemd-units\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594505 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-var-lib-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-ovnkube-config\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6af1b08-59ae-47ba-9588-6c079464ff78-ovn-node-metrics-cert\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-cnibin\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1042efe6-4e47-4044-89ad-3285ece081ef-agent-certs\") pod \"konnectivity-agent-4zx8l\" (UID: \"1042efe6-4e47-4044-89ad-3285ece081ef\") " pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594806 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-etc-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.596540 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-ovnkube-script-lib\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-system-cni-dir\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594747 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-socket-dir-parent\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594972 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.594962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d7bcf82-4223-4a89-a957-2d50a0af065e-hosts-file\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-iptables-alerter-script\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjc6\" (UniqueName: \"kubernetes.io/projected/2b3ed1e6-f33c-4bff-99f0-e47538021110-kube-api-access-nqjc6\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-socket-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-run\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595282 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jl4d4\""
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-conf-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jtt\" (UniqueName: \"kubernetes.io/projected/c904be64-c3a8-4c8b-97c6-75f15dbf9670-kube-api-access-l5jtt\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-env-overrides\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-system-cni-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-os-release\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.597502 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-lib-modules\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtk7q\" (UniqueName: \"kubernetes.io/projected/29976138-ab3d-47e3-b7e7-973bdd4b5161-kube-api-access-wtk7q\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-tuning-conf-dir\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbgjq\" (UniqueName: \"kubernetes.io/projected/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-kube-api-access-qbgjq\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-host\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d7bcf82-4223-4a89-a957-2d50a0af065e-tmp-dir\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595780 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-kubernetes\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29976138-ab3d-47e3-b7e7-973bdd4b5161-tmp\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.595872 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-systemd\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598254 ip-10-0-140-163
kubenswrapper[2574]: I0421 04:23:43.595912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-ovn\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-kubelet\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-hostroot\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-registration-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysconfig\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-node-log\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-os-release\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-daemon-config\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.598254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596380 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-host\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-cni-bin\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-cni-netd\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-cni-binary-copy\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-cnibin\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.596948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-serviceca\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-sys\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597038 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grt2n\" (UniqueName: \"kubernetes.io/projected/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-kube-api-access-grt2n\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-modprobe-d\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7fzw\" (UniqueName: \"kubernetes.io/projected/c6af1b08-59ae-47ba-9588-6c079464ff78-kube-api-access-j7fzw\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597411 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-sys-fs\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjp97\" (UniqueName: \"kubernetes.io/projected/b5e5ea7e-ba73-49e1-947a-a711cd272c01-kube-api-access-jjp97\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.598986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597544 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgsmz\" (UniqueName: \"kubernetes.io/projected/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-kube-api-access-qgsmz\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.597583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysctl-d\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.599312 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-tuned\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.599342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngh4g\" (UniqueName: \"kubernetes.io/projected/9d7bcf82-4223-4a89-a957-2d50a0af065e-kube-api-access-ngh4g\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") "
pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.599368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysctl-conf\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.599390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-host-slash\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.599413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-cni-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.599829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.599435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:43.613062 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.613040 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 04:23:43.624169 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.624143 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:18:42 +0000 UTC" deadline="2028-01-12 14:53:54.54013021 +0000 UTC"
Apr 21 04:23:43.624169 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.624169 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15154h30m10.915963863s"
Apr 21 04:23:43.684650 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.684624 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 04:23:43.686676 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.686632 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" event={"ID":"79e8d65e08ec631d5f830fe768b488cf","Type":"ContainerStarted","Data":"c4472dc6941f000bf31297dbf880b201839a1b1003375a237f1f5b96b8be9fb7"}
Apr 21 04:23:43.688024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.687994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal" event={"ID":"3f4b2ba79c5ad3c70dc6c2f14b4ac240","Type":"ContainerStarted","Data":"5240d5ab38b68673b15af46f86f11f875f253e3700dc023d180f7d9ddb5d8375"}
Apr 21 04:23:43.700061 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-cni-multus\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.700186 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-etc-kubernetes\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.700186 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-systemd-units\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700186 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-cni-multus\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.700337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-var-lib-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-etc-kubernetes\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.700337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-ovnkube-config\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-var-lib-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6af1b08-59ae-47ba-9588-6c079464ff78-ovn-node-metrics-cert\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700315 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-systemd-units\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700359 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-cnibin\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-cnibin\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700406 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1042efe6-4e47-4044-89ad-3285ece081ef-agent-certs\") pod \"konnectivity-agent-4zx8l\" (UID: \"1042efe6-4e47-4044-89ad-3285ece081ef\") " pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-etc-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-ovnkube-script-lib\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-system-cni-dir\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700513 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-socket-dir-parent\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-etc-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d7bcf82-4223-4a89-a957-2d50a0af065e-hosts-file\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.700602 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-system-cni-dir\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-iptables-alerter-script\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjc6\" (UniqueName: \"kubernetes.io/projected/2b3ed1e6-f33c-4bff-99f0-e47538021110-kube-api-access-nqjc6\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700677 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-socket-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-socket-dir-parent\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-run\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d7bcf82-4223-4a89-a957-2d50a0af065e-hosts-file\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-conf-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jtt\" (UniqueName: \"kubernetes.io/projected/c904be64-c3a8-4c8b-97c6-75f15dbf9670-kube-api-access-l5jtt\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-env-overrides\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-system-cni-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-socket-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-os-release\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-lib-modules\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.701124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtk7q\" (UniqueName: \"kubernetes.io/projected/29976138-ab3d-47e3-b7e7-973bdd4b5161-kube-api-access-wtk7q\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-tuning-conf-dir\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-ovnkube-config\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.700972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbgjq\" (UniqueName: \"kubernetes.io/projected/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-kube-api-access-qbgjq\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-host\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d7bcf82-4223-4a89-a957-2d50a0af065e-tmp-dir\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-kubernetes\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29976138-ab3d-47e3-b7e7-973bdd4b5161-tmp\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-systemd\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701401 2574
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-ovn\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-kubelet\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-hostroot\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-ovnkube-script-lib\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-os-release\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c6af1b08-59ae-47ba-9588-6c079464ff78-env-overrides\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701484 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-registration-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-run\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysconfig\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.701967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-conf-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-node-log\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-os-release\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-daemon-config\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-host\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-cni-bin\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-tuning-conf-dir\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-cni-netd\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-cni-binary-copy\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-cni-netd\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-ovn\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701868 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-kubelet\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-lib-modules\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-iptables-alerter-script\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-host\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701831 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d7bcf82-4223-4a89-a957-2d50a0af065e-tmp-dir\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454" Apr 21 04:23:43.702792 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-system-cni-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.701981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-registration-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-node-log\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-systemd\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/2b3ed1e6-f33c-4bff-99f0-e47538021110-os-release\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-cnibin\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-host\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-cni-bin\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-hostroot\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-serviceca\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-kubernetes\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-sys\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grt2n\" (UniqueName: 
\"kubernetes.io/projected/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-kube-api-access-grt2n\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-cni-binary-copy\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4" Apr 21 04:23:43.703521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-modprobe-d\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702382 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7fzw\" (UniqueName: \"kubernetes.io/projected/c6af1b08-59ae-47ba-9588-6c079464ff78-kube-api-access-j7fzw\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-sys-fs\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjp97\" (UniqueName: \"kubernetes.io/projected/b5e5ea7e-ba73-49e1-947a-a711cd272c01-kube-api-access-jjp97\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgsmz\" (UniqueName: \"kubernetes.io/projected/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-kube-api-access-qgsmz\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysctl-d\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 
21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-daemon-config\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-tuned\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngh4g\" (UniqueName: \"kubernetes.io/projected/9d7bcf82-4223-4a89-a957-2d50a0af065e-kube-api-access-ngh4g\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysctl-conf\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.703177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-host-slash\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.703367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-sys\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-serviceca\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.703751 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-sys-fs\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" Apr 21 04:23:43.704337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.703964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysctl-conf\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.705037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.704430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysctl-d\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" Apr 21 04:23:43.705037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.704527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-cni-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm" Apr 21 04:23:43.705037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.704572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:43.705037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.704652 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-run-ovn-kubernetes\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" Apr 21 04:23:43.705037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.704690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-k8s-cni-cncf-io\") pod \"multus-77fkm\" (UID: 
\"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.705037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.702591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-run-openvswitch\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.705312 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-multus-certs\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.705312 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1042efe6-4e47-4044-89ad-3285ece081ef-konnectivity-ca\") pod \"konnectivity-agent-4zx8l\" (UID: \"1042efe6-4e47-4044-89ad-3285ece081ef\") " pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.705312 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-device-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.705312 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-etc-selinux\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.705312 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705290 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c6af1b08-59ae-47ba-9588-6c079464ff78-ovn-node-metrics-cert\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.705312 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-var-lib-kubelet\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705327 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b3ed1e6-f33c-4bff-99f0-e47538021110-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-kubelet\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-slash\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705414 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-kubelet\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.705436 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-run-netns\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.705583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.705485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-k8s-cni-cncf-io\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.705932 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.705913 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:44.205891036 +0000 UTC m=+3.065995018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706226 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-sysconfig\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706252 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-var-lib-kubelet\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-etc-selinux\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-modprobe-d\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-slash\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1042efe6-4e47-4044-89ad-3285ece081ef-konnectivity-ca\") pod \"konnectivity-agent-4zx8l\" (UID: \"1042efe6-4e47-4044-89ad-3285ece081ef\") " pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-log-socket\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b5e5ea7e-ba73-49e1-947a-a711cd272c01-device-dir\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-run-ovn-kubernetes\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-cni-binary-copy\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-host-run-netns\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-host-slash\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706564 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-multus-certs\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c6af1b08-59ae-47ba-9588-6c079464ff78-log-socket\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-systemd\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-multus-cni-dir\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-netns\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.708537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-cni-bin\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706749 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-var-lib-cni-bin\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-systemd\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-host-run-netns\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.706928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-cnibin\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.707190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-cni-binary-copy\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.708250 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.708267 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.708281 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:43.708353 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:44.208334355 +0000 UTC m=+3.068438331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:43.709432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.708742 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1042efe6-4e47-4044-89ad-3285ece081ef-agent-certs\") pod \"konnectivity-agent-4zx8l\" (UID: \"1042efe6-4e47-4044-89ad-3285ece081ef\") " pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.711798 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.711774 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtk7q\" (UniqueName: \"kubernetes.io/projected/29976138-ab3d-47e3-b7e7-973bdd4b5161-kube-api-access-wtk7q\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.713143 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.712590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29976138-ab3d-47e3-b7e7-973bdd4b5161-tmp\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.714665 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.714638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgsmz\" (UniqueName: \"kubernetes.io/projected/59fe2a00-5ff4-4e40-b04e-88c5b7b9b743-kube-api-access-qgsmz\") pod \"node-ca-pqxgt\" (UID: \"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743\") " pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.714803 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.714784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjc6\" (UniqueName: \"kubernetes.io/projected/2b3ed1e6-f33c-4bff-99f0-e47538021110-kube-api-access-nqjc6\") pod \"multus-additional-cni-plugins-85nl4\" (UID: \"2b3ed1e6-f33c-4bff-99f0-e47538021110\") " pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.714880 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.714815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jtt\" (UniqueName: \"kubernetes.io/projected/c904be64-c3a8-4c8b-97c6-75f15dbf9670-kube-api-access-l5jtt\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:43.715305 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.715241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbgjq\" (UniqueName: \"kubernetes.io/projected/a70a5ac4-3cb0-4b1d-b4c0-243990f39d76-kube-api-access-qbgjq\") pod \"multus-77fkm\" (UID: \"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76\") " pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.715449 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.715432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7fzw\" (UniqueName: \"kubernetes.io/projected/c6af1b08-59ae-47ba-9588-6c079464ff78-kube-api-access-j7fzw\") pod \"ovnkube-node-szzqq\" (UID: \"c6af1b08-59ae-47ba-9588-6c079464ff78\") " pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.715519 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.715449 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngh4g\" (UniqueName: \"kubernetes.io/projected/9d7bcf82-4223-4a89-a957-2d50a0af065e-kube-api-access-ngh4g\") pod \"node-resolver-7b454\" (UID: \"9d7bcf82-4223-4a89-a957-2d50a0af065e\") " pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.717414 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.717357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grt2n\" (UniqueName: \"kubernetes.io/projected/0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d-kube-api-access-grt2n\") pod \"iptables-alerter-zc2r8\" (UID: \"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d\") " pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.718322 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.718305 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjp97\" (UniqueName: \"kubernetes.io/projected/b5e5ea7e-ba73-49e1-947a-a711cd272c01-kube-api-access-jjp97\") pod \"aws-ebs-csi-driver-node-xsbnp\" (UID: \"b5e5ea7e-ba73-49e1-947a-a711cd272c01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.718410 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.718385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29976138-ab3d-47e3-b7e7-973bdd4b5161-etc-tuned\") pod \"tuned-xlmcp\" (UID: \"29976138-ab3d-47e3-b7e7-973bdd4b5161\") " pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:43.892905 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.892822 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp"
Apr 21 04:23:43.907548 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.907516 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-85nl4"
Apr 21 04:23:43.917257 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.917231 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-77fkm"
Apr 21 04:23:43.923915 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.923894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:23:43.932471 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.932449 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqxgt"
Apr 21 04:23:43.939030 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.939014 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7b454"
Apr 21 04:23:43.946526 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.946504 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zc2r8"
Apr 21 04:23:43.954124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.954106 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:23:43.959678 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:43.959658 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xlmcp"
Apr 21 04:23:44.209997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.209909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:44.209997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.209959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:44.210226 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.210084 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:44.210226 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.210098 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:23:44.210226 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.210120 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:23:44.210226 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.210133 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:44.210226 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.210139 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:45.210119835 +0000 UTC m=+4.070223813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:44.210226 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.210184 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:45.210167889 +0000 UTC m=+4.070271893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:44.525520 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.525425 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70a5ac4_3cb0_4b1d_b4c0_243990f39d76.slice/crio-d0f0cbf6af06a7207d4951e36cd0528890f0c7cdaba7f3a474189bf84b677345 WatchSource:0}: Error finding container d0f0cbf6af06a7207d4951e36cd0528890f0c7cdaba7f3a474189bf84b677345: Status 404 returned error can't find the container with id d0f0cbf6af06a7207d4951e36cd0528890f0c7cdaba7f3a474189bf84b677345
Apr 21 04:23:44.527088 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.527060 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59fe2a00_5ff4_4e40_b04e_88c5b7b9b743.slice/crio-29ce5e46326e55eca66c77d3dbc32b2106f9a022191ebf73fb4f111f1000e6d7 WatchSource:0}: Error finding container 29ce5e46326e55eca66c77d3dbc32b2106f9a022191ebf73fb4f111f1000e6d7: Status 404 returned error can't find the container with id 29ce5e46326e55eca66c77d3dbc32b2106f9a022191ebf73fb4f111f1000e6d7
Apr 21 04:23:44.528495 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.528468 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1042efe6_4e47_4044_89ad_3285ece081ef.slice/crio-efb731fdbc6298b36f5c778330fe08b59ade199f7c000d7344f4f3ac478aa4e0 WatchSource:0}: Error finding container efb731fdbc6298b36f5c778330fe08b59ade199f7c000d7344f4f3ac478aa4e0: Status 404 returned error can't find the container with id efb731fdbc6298b36f5c778330fe08b59ade199f7c000d7344f4f3ac478aa4e0
Apr 21 04:23:44.530302 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.530280 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a3bcd90_ef3a_4aec_bb35_a5dfcb101f0d.slice/crio-3f73485d260413effd8a597d322f469e4df92926220cfdd6cbdf6aa3ac685c39 WatchSource:0}: Error finding container 3f73485d260413effd8a597d322f469e4df92926220cfdd6cbdf6aa3ac685c39: Status 404 returned error can't find the container with id 3f73485d260413effd8a597d322f469e4df92926220cfdd6cbdf6aa3ac685c39
Apr 21 04:23:44.531963 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.531946 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d7bcf82_4223_4a89_a957_2d50a0af065e.slice/crio-ff394bbfb2d7e6732d511fd963870a9787837b587d290653af087af91b22d369 WatchSource:0}: Error finding container ff394bbfb2d7e6732d511fd963870a9787837b587d290653af087af91b22d369: Status 404 returned error can't find the container with id ff394bbfb2d7e6732d511fd963870a9787837b587d290653af087af91b22d369
Apr 21 04:23:44.533707 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.533499 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6af1b08_59ae_47ba_9588_6c079464ff78.slice/crio-22d20312d1d9e71b567b6931b7cab2a8a2e6dd4be6fd05d488d41761d47fc9a4 WatchSource:0}: Error finding container 22d20312d1d9e71b567b6931b7cab2a8a2e6dd4be6fd05d488d41761d47fc9a4: Status 404 returned error can't find the container with id 22d20312d1d9e71b567b6931b7cab2a8a2e6dd4be6fd05d488d41761d47fc9a4
Apr 21 04:23:44.534305 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.534290 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29976138_ab3d_47e3_b7e7_973bdd4b5161.slice/crio-3e9ac7a26a5109030aab5d8b1d7df0b7fa66cfe690052622f783d50104979e55 WatchSource:0}: Error finding container 3e9ac7a26a5109030aab5d8b1d7df0b7fa66cfe690052622f783d50104979e55: Status 404 returned error can't find the container with id 3e9ac7a26a5109030aab5d8b1d7df0b7fa66cfe690052622f783d50104979e55
Apr 21 04:23:44.535488 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.535379 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3ed1e6_f33c_4bff_99f0_e47538021110.slice/crio-008dcffcfc1038384d7074d88a0c1f0994ad4af59d5b879ff31155325e9389e8 WatchSource:0}: Error finding container 008dcffcfc1038384d7074d88a0c1f0994ad4af59d5b879ff31155325e9389e8: Status 404 returned error can't find the container with id 008dcffcfc1038384d7074d88a0c1f0994ad4af59d5b879ff31155325e9389e8
Apr 21 04:23:44.536990 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:23:44.536889 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e5ea7e_ba73_49e1_947a_a711cd272c01.slice/crio-12537b590da44d9d50b935d1d5ae953ccf510bf1f6229d4718382b1868f48b2a WatchSource:0}: Error finding container 12537b590da44d9d50b935d1d5ae953ccf510bf1f6229d4718382b1868f48b2a: Status 404 returned error can't find the container with id 12537b590da44d9d50b935d1d5ae953ccf510bf1f6229d4718382b1868f48b2a
Apr 21 04:23:44.624657 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.624615 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:18:42 +0000 UTC" deadline="2027-12-02 00:03:59.241836028 +0000 UTC"
Apr 21 04:23:44.624657 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.624654 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14155h40m14.617184611s"
Apr 21 04:23:44.683521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.683492 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:44.683681 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:44.683606 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:23:44.690706 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.690679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"22d20312d1d9e71b567b6931b7cab2a8a2e6dd4be6fd05d488d41761d47fc9a4"}
Apr 21 04:23:44.691768 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.691742 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zc2r8" event={"ID":"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d","Type":"ContainerStarted","Data":"3f73485d260413effd8a597d322f469e4df92926220cfdd6cbdf6aa3ac685c39"}
Apr 21 04:23:44.692763 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.692743 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqxgt" event={"ID":"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743","Type":"ContainerStarted","Data":"29ce5e46326e55eca66c77d3dbc32b2106f9a022191ebf73fb4f111f1000e6d7"}
Apr 21 04:23:44.693713 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.693690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-77fkm" event={"ID":"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76","Type":"ContainerStarted","Data":"d0f0cbf6af06a7207d4951e36cd0528890f0c7cdaba7f3a474189bf84b677345"}
Apr 21 04:23:44.695292 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.695270 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal" event={"ID":"3f4b2ba79c5ad3c70dc6c2f14b4ac240","Type":"ContainerStarted","Data":"451b2ccc1c3eeca6f007ad2ad17ee6bde8dc47ef4c98a4930137e13cb6a5329e"}
Apr 21 04:23:44.696333 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.696305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7b454" event={"ID":"9d7bcf82-4223-4a89-a957-2d50a0af065e","Type":"ContainerStarted","Data":"ff394bbfb2d7e6732d511fd963870a9787837b587d290653af087af91b22d369"}
Apr 21 04:23:44.697393 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.697370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4zx8l" event={"ID":"1042efe6-4e47-4044-89ad-3285ece081ef","Type":"ContainerStarted","Data":"efb731fdbc6298b36f5c778330fe08b59ade199f7c000d7344f4f3ac478aa4e0"}
Apr 21 04:23:44.698872 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.698614 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" event={"ID":"b5e5ea7e-ba73-49e1-947a-a711cd272c01","Type":"ContainerStarted","Data":"12537b590da44d9d50b935d1d5ae953ccf510bf1f6229d4718382b1868f48b2a"}
Apr 21 04:23:44.699756 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.699713 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerStarted","Data":"008dcffcfc1038384d7074d88a0c1f0994ad4af59d5b879ff31155325e9389e8"}
Apr 21 04:23:44.700578 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.700560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" event={"ID":"29976138-ab3d-47e3-b7e7-973bdd4b5161","Type":"ContainerStarted","Data":"3e9ac7a26a5109030aab5d8b1d7df0b7fa66cfe690052622f783d50104979e55"}
Apr 21 04:23:44.707098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:44.707060 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-163.ec2.internal" podStartSLOduration=2.707047208 podStartE2EDuration="2.707047208s" podCreationTimestamp="2026-04-21 04:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:23:44.706492186 +0000 UTC m=+3.566596166" watchObservedRunningTime="2026-04-21 04:23:44.707047208 +0000 UTC m=+3.567151187"
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:45.219536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:45.219585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.219735 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.219759 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.219774 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.219834 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:47.219814681 +0000 UTC m=+6.079918639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.219740 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:45.220108 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.220056 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed.
No retries permitted until 2026-04-21 04:23:47.220039776 +0000 UTC m=+6.080143740 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:45.688048 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:45.688012 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:45.688512 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:45.688178 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:23:45.728815 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:45.728775 2574 generic.go:358] "Generic (PLEG): container finished" podID="79e8d65e08ec631d5f830fe768b488cf" containerID="503fe5e787211202f79bc76ee6cfa6110dc8c65658ce181aa729772c425cac9a" exitCode=0 Apr 21 04:23:45.728976 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:45.728947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" event={"ID":"79e8d65e08ec631d5f830fe768b488cf","Type":"ContainerDied","Data":"503fe5e787211202f79bc76ee6cfa6110dc8c65658ce181aa729772c425cac9a"} Apr 21 04:23:46.683751 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:46.683705 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:46.683935 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:46.683865 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:23:46.748306 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:46.747609 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" event={"ID":"79e8d65e08ec631d5f830fe768b488cf","Type":"ContainerStarted","Data":"0c6442cb39b81366322ae30f86dc85a0e11cf78f6374a498582d92848af36b98"} Apr 21 04:23:47.237181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:47.237140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:47.237358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:47.237204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:47.237358 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.237314 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 21 04:23:47.237528 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.237375 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:51.237357633 +0000 UTC m=+10.097461599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:47.237777 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.237311 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:23:47.237777 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.237651 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:23:47.237777 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.237670 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:47.237777 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.237744 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:23:51.237707767 +0000 UTC m=+10.097811747 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:47.683189 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:47.683090 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:47.683396 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:47.683224 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:23:48.682933 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:48.682898 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:48.683434 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:48.683047 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:23:49.682874 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:49.682824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:49.683093 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:49.682981 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:23:50.683478 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:50.683439 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:50.684051 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:50.683593 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:23:51.274519 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:51.274475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:51.274713 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:51.274541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:51.274713 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.274689 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:51.274866 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.274775 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:59.274753789 +0000 UTC m=+18.134857748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:51.275255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.275138 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:23:51.275255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.275166 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:23:51.275255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.275180 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:51.275255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.275234 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:59.275217931 +0000 UTC m=+18.135321891 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:51.685925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:51.684566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:51.685925 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:51.684671 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:23:52.333655 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.332474 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-163.ec2.internal" podStartSLOduration=10.33245124 podStartE2EDuration="10.33245124s" podCreationTimestamp="2026-04-21 04:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:23:46.761201928 +0000 UTC m=+5.621305908" watchObservedRunningTime="2026-04-21 04:23:52.33245124 +0000 UTC m=+11.192555220" Apr 21 04:23:52.333655 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.332882 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-csr9d"] Apr 21 04:23:52.338518 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.338094 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.338518 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:52.338175 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c" Apr 21 04:23:52.382187 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.382126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.382366 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.382227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ce0750e5-90f6-418d-9510-f807b89c746c-dbus\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.383234 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.383041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ce0750e5-90f6-418d-9510-f807b89c746c-kubelet-config\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.483659 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.483470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ce0750e5-90f6-418d-9510-f807b89c746c-kubelet-config\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.483659 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.483577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.483659 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.483604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ce0750e5-90f6-418d-9510-f807b89c746c-kubelet-config\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.483659 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.483610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ce0750e5-90f6-418d-9510-f807b89c746c-dbus\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.484000 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:52.483741 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:23:52.484000 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:52.483800 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret podName:ce0750e5-90f6-418d-9510-f807b89c746c nodeName:}" failed. No retries permitted until 2026-04-21 04:23:52.98378171 +0000 UTC m=+11.843885681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret") pod "global-pull-secret-syncer-csr9d" (UID: "ce0750e5-90f6-418d-9510-f807b89c746c") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:23:52.484000 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.483849 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ce0750e5-90f6-418d-9510-f807b89c746c-dbus\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.683319 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.683238 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:52.683499 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:52.683380 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:23:52.986806 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:52.986711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:52.987246 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:52.986864 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:23:52.987246 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:52.986940 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret podName:ce0750e5-90f6-418d-9510-f807b89c746c nodeName:}" failed. No retries permitted until 2026-04-21 04:23:53.98691972 +0000 UTC m=+12.847023685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret") pod "global-pull-secret-syncer-csr9d" (UID: "ce0750e5-90f6-418d-9510-f807b89c746c") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:23:53.683217 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:53.683176 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:53.683365 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:53.683300 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:23:53.683365 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:53.683338 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:53.683492 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:53.683421 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c" Apr 21 04:23:53.993750 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:53.993706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:53.994222 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:53.993843 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:23:53.994222 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:53.993921 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret podName:ce0750e5-90f6-418d-9510-f807b89c746c nodeName:}" failed. No retries permitted until 2026-04-21 04:23:55.99390241 +0000 UTC m=+14.854006370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret") pod "global-pull-secret-syncer-csr9d" (UID: "ce0750e5-90f6-418d-9510-f807b89c746c") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:23:54.683573 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:54.683534 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:23:54.683816 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:54.683695 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:23:55.682909 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:55.682873 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:23:55.683368 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:55.682873 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:23:55.683368 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:55.683024 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c" Apr 21 04:23:55.683368 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:55.683071 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:23:56.008556 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:56.008464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:23:56.008752 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:56.008643 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:23:56.008752 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:56.008731 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret podName:ce0750e5-90f6-418d-9510-f807b89c746c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:00.008700865 +0000 UTC m=+18.868804833 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret") pod "global-pull-secret-syncer-csr9d" (UID: "ce0750e5-90f6-418d-9510-f807b89c746c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:23:56.683134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:56.683101 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:56.683581 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:56.683242 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:23:57.683650 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:57.683603 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:23:57.684188 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:57.683757 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:23:57.684188 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:57.683798 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:57.684188 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:57.683914 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:23:58.683244 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:58.683168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:58.683445 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:58.683300 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:23:59.330824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:59.330785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:59.330853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.330956 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.330982 2574 projected.go:289]
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.330995 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.331056 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:15.331038439 +0000 UTC m=+34.191142400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.330960 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:59.331276 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.331120 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:15.331102959 +0000 UTC m=+34.191206916 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:23:59.682716 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:59.682636 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:23:59.682892 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:23:59.682636 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:23:59.682892 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.682791 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:23:59.682892 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:23:59.682852 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:24:00.036768 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:00.036675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:00.036951 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:00.036812 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:24:00.036951 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:00.036880 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret podName:ce0750e5-90f6-418d-9510-f807b89c746c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:08.036865224 +0000 UTC m=+26.896969181 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret") pod "global-pull-secret-syncer-csr9d" (UID: "ce0750e5-90f6-418d-9510-f807b89c746c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:24:00.683588 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:00.683544 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:00.684269 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:00.683702 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:24:01.684534 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:01.684496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:01.684913 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:01.684615 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:24:01.684913 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:01.684663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:01.684913 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:01.684763 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:24:02.683303 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.682987 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:02.683466 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:02.683415 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:24:02.781144 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.781117 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b3ed1e6-f33c-4bff-99f0-e47538021110" containerID="41b7614e5b865476f3f876d34b9658f88de85b21b44a3469043374a828d6e0f1" exitCode=0
Apr 21 04:24:02.781837 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.781170 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerDied","Data":"41b7614e5b865476f3f876d34b9658f88de85b21b44a3469043374a828d6e0f1"}
Apr 21 04:24:02.782518 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.782493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" event={"ID":"29976138-ab3d-47e3-b7e7-973bdd4b5161","Type":"ContainerStarted","Data":"9ab28e5a63b253bf39ae358ebb2556a8b401b8a4e0794fb67502050d06ecf32d"}
Apr 21 04:24:02.784805 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.784789 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log"
Apr 21 04:24:02.785053 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785037 2574 generic.go:358] "Generic (PLEG): container finished" podID="c6af1b08-59ae-47ba-9588-6c079464ff78" containerID="1ec10386caa7a37fc40b1ac11b0ba35d040b4cbd796e3c7221beade51b58a9d5" exitCode=1
Apr 21 04:24:02.785112 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"1f44435ee2c1d94656115d0045e74c430f1cc9fa09070295d53506f7fdca0a96"}
Apr 21 04:24:02.785161 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"506f831f1b241af69eee541cb185dd536f21fbf0eb9e69567bb216c3e0175491"}
Apr 21 04:24:02.785161 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"6415e74e61f15d13e00c42ac113d1c82f6a123ad343b420461b64f4847a2bdee"}
Apr 21 04:24:02.785161 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"bea77f39328b2803d220b675df36868b40b8b92d4315ebe64ff9a11114f98589"}
Apr 21 04:24:02.785257 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785161 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerDied","Data":"1ec10386caa7a37fc40b1ac11b0ba35d040b4cbd796e3c7221beade51b58a9d5"}
Apr 21 04:24:02.785257 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.785175 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"4d9c03db0c4f99fa989a9fb15a2bb5a20214a3a2afc3e995abe7dc7f81bb16bd"}
Apr 21 04:24:02.786251 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.786231 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqxgt" event={"ID":"59fe2a00-5ff4-4e40-b04e-88c5b7b9b743","Type":"ContainerStarted","Data":"8244347b069d3b25a37123bd9d0cc8c1a9a5b772574375741b339b7f8c9d2988"}
Apr 21 04:24:02.787410 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.787389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-77fkm" event={"ID":"a70a5ac4-3cb0-4b1d-b4c0-243990f39d76","Type":"ContainerStarted","Data":"bf79e41b5c6325bc29fa6bff31c545cd7f7d54658e30857d9d2f1858bdf6e396"}
Apr 21 04:24:02.788521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.788501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7b454" event={"ID":"9d7bcf82-4223-4a89-a957-2d50a0af065e","Type":"ContainerStarted","Data":"979885349f59152f7ba45966d572ce1e475feff4fe016ad5761d49e0b65c6a87"}
Apr 21 04:24:02.789589 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.789563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4zx8l" event={"ID":"1042efe6-4e47-4044-89ad-3285ece081ef","Type":"ContainerStarted","Data":"5c8098c74063c0c68b9e35979354f8b5d26e559bcb165998832ad520a05e4aaa"}
Apr 21 04:24:02.790604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.790589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" event={"ID":"b5e5ea7e-ba73-49e1-947a-a711cd272c01","Type":"ContainerStarted","Data":"667ab7cfe7f2cda113743f3dbe7a9b127efead24855bac899e2c9e087529f8dc"}
Apr 21 04:24:02.845454 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.845413 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xlmcp" podStartSLOduration=4.568367169 podStartE2EDuration="21.84539996s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.537919639 +0000 UTC m=+3.398023597" lastFinishedPulling="2026-04-21 04:24:01.814952411 +0000 UTC m=+20.675056388" observedRunningTime="2026-04-21 04:24:02.845381152 +0000 UTC m=+21.705485131" watchObservedRunningTime="2026-04-21 04:24:02.84539996 +0000 UTC m=+21.705503938"
Apr 21 04:24:02.845593 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.845524 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pqxgt" podStartSLOduration=4.670653991 podStartE2EDuration="21.845520328s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.529759889 +0000 UTC m=+3.389863850" lastFinishedPulling="2026-04-21 04:24:01.704626216 +0000 UTC m=+20.564730187" observedRunningTime="2026-04-21 04:24:02.830220597 +0000 UTC m=+21.690324576" watchObservedRunningTime="2026-04-21 04:24:02.845520328 +0000 UTC m=+21.705624352"
Apr 21 04:24:02.857464 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.857425 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4zx8l" podStartSLOduration=4.68309 podStartE2EDuration="21.857413169s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.530360228 +0000 UTC m=+3.390464185" lastFinishedPulling="2026-04-21 04:24:01.704683381 +0000 UTC m=+20.564787354" observedRunningTime="2026-04-21 04:24:02.857065926 +0000 UTC m=+21.717169904" watchObservedRunningTime="2026-04-21 04:24:02.857413169 +0000 UTC m=+21.717517148"
Apr 21 04:24:02.889554 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.889508 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7b454" podStartSLOduration=4.718831742
podStartE2EDuration="21.889493435s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.533967609 +0000 UTC m=+3.394071572" lastFinishedPulling="2026-04-21 04:24:01.704629295 +0000 UTC m=+20.564733265" observedRunningTime="2026-04-21 04:24:02.868771024 +0000 UTC m=+21.728874999" watchObservedRunningTime="2026-04-21 04:24:02.889493435 +0000 UTC m=+21.749597411"
Apr 21 04:24:02.889708 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:02.889608 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-77fkm" podStartSLOduration=4.675652142 podStartE2EDuration="21.889603918s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.527209927 +0000 UTC m=+3.387313887" lastFinishedPulling="2026-04-21 04:24:01.741161703 +0000 UTC m=+20.601265663" observedRunningTime="2026-04-21 04:24:02.889302064 +0000 UTC m=+21.749406057" watchObservedRunningTime="2026-04-21 04:24:02.889603918 +0000 UTC m=+21.749707897"
Apr 21 04:24:03.209108 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.209083 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 04:24:03.653825 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.653689 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T04:24:03.209101456Z","UUID":"ee2591b0-3972-4288-8132-24ad2443db76","Handler":null,"Name":"","Endpoint":""}
Apr 21 04:24:03.656803 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.656778 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 04:24:03.656803 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.656810 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 04:24:03.683860 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.683438 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:03.683860 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:03.683568 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:24:03.683860 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.683624 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:03.683860 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:03.683752 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:24:03.794468 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.794432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" event={"ID":"b5e5ea7e-ba73-49e1-947a-a711cd272c01","Type":"ContainerStarted","Data":"9eb129103a24c82cf0849dc4b320248b96690024fcc7cee7e66dc9b5b6c24453"}
Apr 21 04:24:03.795883 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.795808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zc2r8" event={"ID":"0a3bcd90-ef3a-4aec-bb35-a5dfcb101f0d","Type":"ContainerStarted","Data":"192679a4b4d7d4534aae4027aa971c973fb999454272eb3a8d2ed33a221a1503"}
Apr 21 04:24:03.819097 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:03.819037 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zc2r8" podStartSLOduration=4.618586226 podStartE2EDuration="21.819020407s" podCreationTimestamp="2026-04-21 04:23:42 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.532300349 +0000 UTC m=+3.392404311" lastFinishedPulling="2026-04-21 04:24:01.732734521 +0000 UTC m=+20.592838492" observedRunningTime="2026-04-21 04:24:03.818675033 +0000 UTC m=+22.678779013" watchObservedRunningTime="2026-04-21 04:24:03.819020407 +0000 UTC m=+22.679124437"
Apr 21 04:24:04.683528 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:04.683372 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:04.683639 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:04.683621 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:24:04.801573 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:04.801544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log"
Apr 21 04:24:04.801994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:04.801964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"c6276510041db513442565ef49db58ce592f748dfb731ab1f0c4a93cc37673c0"}
Apr 21 04:24:04.803985 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:04.803947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" event={"ID":"b5e5ea7e-ba73-49e1-947a-a711cd272c01","Type":"ContainerStarted","Data":"b9a92a0bed7d07fe8b2667715d18bea732e95820961def2fa7110c715e21fd0d"}
Apr 21 04:24:04.820955 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:04.820901 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xsbnp" podStartSLOduration=4.147250062 podStartE2EDuration="23.820884135s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.540797954 +0000 UTC m=+3.400901913" lastFinishedPulling="2026-04-21 04:24:04.214432014 +0000 UTC m=+23.074535986"
observedRunningTime="2026-04-21 04:24:04.820792673 +0000 UTC m=+23.680896653" watchObservedRunningTime="2026-04-21 04:24:04.820884135 +0000 UTC m=+23.680988112"
Apr 21 04:24:05.683142 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:05.683109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:05.683356 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:05.683109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:05.683356 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:05.683244 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:24:05.683356 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:05.683290 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:24:05.789652 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:05.789613 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:24:05.790261 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:05.790240 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:24:06.683091 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:06.683064 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:06.683662 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:06.683179 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:24:07.683119 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.682888 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:07.683638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.682905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:07.683638 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:07.683156 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:24:07.683638 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:07.683214 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:24:07.811001 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.810971 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b3ed1e6-f33c-4bff-99f0-e47538021110" containerID="72895e86bc8bc6b09e3836feba15421bfbb20f0b08b4979d50da2f5868c24eed" exitCode=0
Apr 21 04:24:07.811148 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.811047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerDied","Data":"72895e86bc8bc6b09e3836feba15421bfbb20f0b08b4979d50da2f5868c24eed"}
Apr 21 04:24:07.814163 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.814144 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log"
Apr 21 04:24:07.814437 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.814417 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"b956b28155f07f0bbfaebf1a4efa7beedb6b4c3f30542df8d8d6339befa6bbd3"}
Apr 21 04:24:07.814713 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.814697 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:24:07.814774 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.814737 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:24:07.814844 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.814832 2574 scope.go:117] "RemoveContainer" containerID="1ec10386caa7a37fc40b1ac11b0ba35d040b4cbd796e3c7221beade51b58a9d5"
Apr 21 04:24:07.830220 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:07.830203 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:24:08.104127 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.104087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:08.104268 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:08.104249 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 04:24:08.104347 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:08.104337 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret podName:ce0750e5-90f6-418d-9510-f807b89c746c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:24.104317911 +0000 UTC m=+42.964421888 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret") pod "global-pull-secret-syncer-csr9d" (UID: "ce0750e5-90f6-418d-9510-f807b89c746c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 04:24:08.683063 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.683028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:08.683297 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:08.683151 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:24:08.822033 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.822002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerStarted","Data":"5b90659b4f0973647ad2f5fda8a540a117b4ad4b928a121b9afc09c7e3c42c31"}
Apr 21 04:24:08.826016 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.825994 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log"
Apr 21 04:24:08.826416 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.826387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" event={"ID":"c6af1b08-59ae-47ba-9588-6c079464ff78","Type":"ContainerStarted","Data":"7b9db098535aa5b71c82800d5b9e4d5e31f1912235b9114ad3d1b941656fed4f"}
Apr 21 04:24:08.826908 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.826868 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:24:08.835243 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.834761 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:24:08.835557 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.835537 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 04:24:08.836077 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.836056 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4zx8l"
Apr 21 04:24:08.843332 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.843308 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:24:08.857331 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:08.857281 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq" podStartSLOduration=9.614328502 podStartE2EDuration="26.857259334s" podCreationTimestamp="2026-04-21 04:23:42 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.536575443 +0000 UTC m=+3.396679415" lastFinishedPulling="2026-04-21 04:24:01.779506275 +0000 UTC m=+20.639610247" observedRunningTime="2026-04-21 04:24:08.857104704 +0000 UTC m=+27.717208723" watchObservedRunningTime="2026-04-21 04:24:08.857259334 +0000 UTC m=+27.717363312"
Apr 21 04:24:09.088306 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.088276 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-csr9d"]
Apr 21 04:24:09.088513 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.088391 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:09.088513 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:09.088466 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c"
Apr 21 04:24:09.092516 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.092491 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wxgvr"]
Apr 21 04:24:09.092640 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.092597 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:09.092713 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:09.092692 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351"
Apr 21 04:24:09.103136 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.103109 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bwjm7"]
Apr 21 04:24:09.103238 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.103183 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:24:09.103286 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:09.103269 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:24:09.830067 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.830031 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b3ed1e6-f33c-4bff-99f0-e47538021110" containerID="5b90659b4f0973647ad2f5fda8a540a117b4ad4b928a121b9afc09c7e3c42c31" exitCode=0 Apr 21 04:24:09.830463 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:09.830114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerDied","Data":"5b90659b4f0973647ad2f5fda8a540a117b4ad4b928a121b9afc09c7e3c42c31"} Apr 21 04:24:10.683078 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:10.683044 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:24:10.683238 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:10.683052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:24:10.683238 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:10.683185 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:24:10.683238 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:10.683044 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:24:10.683238 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:10.683224 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c" Apr 21 04:24:10.683440 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:10.683302 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:24:10.834404 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:10.834375 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b3ed1e6-f33c-4bff-99f0-e47538021110" containerID="3c55a940aab331fbfe252b30b7f1e8b30d35ba6b9d56871b9d2d6fba26f45ca3" exitCode=0 Apr 21 04:24:10.834823 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:10.834458 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerDied","Data":"3c55a940aab331fbfe252b30b7f1e8b30d35ba6b9d56871b9d2d6fba26f45ca3"} Apr 21 04:24:12.683564 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:12.683377 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:24:12.684016 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:12.683377 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:24:12.684016 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:12.683377 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:24:12.684016 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:12.683690 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:24:12.684016 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:12.683816 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:24:12.684016 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:12.683897 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c" Apr 21 04:24:14.683324 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:14.683280 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr" Apr 21 04:24:14.683940 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:14.683280 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:24:14.683940 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:14.683422 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wxgvr" podUID="bf24d380-8460-46e7-b05d-3783cbfac351" Apr 21 04:24:14.683940 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:14.683280 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d" Apr 21 04:24:14.683940 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:14.683541 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670" Apr 21 04:24:14.683940 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:14.683641 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-csr9d" podUID="ce0750e5-90f6-418d-9510-f807b89c746c" Apr 21 04:24:14.999501 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:14.999421 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-163.ec2.internal" event="NodeReady" Apr 21 04:24:14.999689 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:14.999582 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 04:24:15.034235 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.034155 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-985mb"] Apr 21 04:24:15.038691 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.038666 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55fbc49767-s2v7s"] Apr 21 04:24:15.038880 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.038861 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:15.041175 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.041148 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 04:24:15.041289 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.041188 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 04:24:15.041289 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.041200 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tv2rf\"" Apr 21 04:24:15.042078 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.042060 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.044369 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.044293 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 04:24:15.044558 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.044538 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-985mb"] Apr 21 04:24:15.044625 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.044560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 04:24:15.044625 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.044549 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v6cxl\"" Apr 21 04:24:15.044625 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.044557 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 04:24:15.048338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.048196 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tz8h8"] Apr 21 04:24:15.055110 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.055082 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rbw56"] Apr 21 04:24:15.055255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.055090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 04:24:15.055255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.055205 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:15.057711 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.057537 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 04:24:15.057817 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.057799 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 04:24:15.057892 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.057876 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vk74d\"" Apr 21 04:24:15.058570 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.058550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbw56" Apr 21 04:24:15.062223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.061308 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 04:24:15.062223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.061546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 04:24:15.062223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.061797 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l78gs\"" Apr 21 04:24:15.062223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.062026 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 04:24:15.062223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.062168 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbw56"] Apr 21 04:24:15.062223 ip-10-0-140-163 kubenswrapper[2574]: 
I0421 04:24:15.062191 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55fbc49767-s2v7s"] Apr 21 04:24:15.063275 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.063257 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tz8h8"] Apr 21 04:24:15.158122 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/668d3b91-6679-4a3c-af26-520078ad97bd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:15.158122 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3910765c-b3b6-4194-b81f-e0ede65ac33b-ca-trust-extracted\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:15.158371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-tmp-dir\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " 
pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:15.158371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56" Apr 21 04:24:15.158371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rxln\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-kube-api-access-9rxln\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-config-volume\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158374 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgm8c\" (UniqueName: 
\"kubernetes.io/projected/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-kube-api-access-lgm8c\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-bound-sa-token\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfz45\" (UniqueName: \"kubernetes.io/projected/65dc137e-873c-4d24-b876-d35819405cac-kube-api-access-gfz45\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-trusted-ca\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-image-registry-private-configuration\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " 
pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-installation-pull-secrets\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.158644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:15.158949 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.158653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-certificates\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.259554 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:15.259554 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-certificates\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.259554 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/668d3b91-6679-4a3c-af26-520078ad97bd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:15.259554 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3910765c-b3b6-4194-b81f-e0ede65ac33b-ca-trust-extracted\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.259629 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.259719 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:24:15.759697693 +0000 UTC m=+34.619801650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.259718 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.259835 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:24:15.759816365 +0000 UTC m=+34.619920324 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found
Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-tmp-dir\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.259925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rxln\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-kube-api-access-9rxln\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-config-volume\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.259983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgm8c\" (UniqueName: \"kubernetes.io/projected/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-kube-api-access-lgm8c\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.260013 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-bound-sa-token\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.260051 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:24:15.760039518 +0000 UTC m=+34.620143485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-tmp-dir\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfz45\" (UniqueName: \"kubernetes.io/projected/65dc137e-873c-4d24-b876-d35819405cac-kube-api-access-gfz45\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3910765c-b3b6-4194-b81f-e0ede65ac33b-ca-trust-extracted\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-trusted-ca\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-image-registry-private-configuration\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-installation-pull-secrets\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.260215 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:24:15.260392 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.260228 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found
Apr 21 04:24:15.260967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-certificates\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.260967 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.260479 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:24:15.760465143 +0000 UTC m=+34.620569100 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:24:15.260967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260549 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-config-volume\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.260967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.260606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/668d3b91-6679-4a3c-af26-520078ad97bd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:24:15.261108 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.261032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-trusted-ca\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.265500 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.265474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-installation-pull-secrets\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.265500 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.265484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-image-registry-private-configuration\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.268492 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.268465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-bound-sa-token\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.268608 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.268565 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfz45\" (UniqueName: \"kubernetes.io/projected/65dc137e-873c-4d24-b876-d35819405cac-kube-api-access-gfz45\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:15.268791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.268770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rxln\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-kube-api-access-9rxln\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.268899 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.268845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgm8c\" (UniqueName: \"kubernetes.io/projected/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-kube-api-access-lgm8c\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.361131 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.361083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:15.361316 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.361191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:15.361316 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.361266 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:24:15.361316 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.361296 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:24:15.361316 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.361311 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nk6bf for pod openshift-network-diagnostics/network-check-target-wxgvr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:15.361316 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.361317 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:15.361564 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.361381 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf podName:bf24d380-8460-46e7-b05d-3783cbfac351 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:47.361361048 +0000 UTC m=+66.221465019 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk6bf" (UniqueName: "kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf") pod "network-check-target-wxgvr" (UID: "bf24d380-8460-46e7-b05d-3783cbfac351") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:24:15.361564 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.361409 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:47.361399157 +0000 UTC m=+66.221503154 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:24:15.763596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.763562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.763632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.763666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:15.763692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763740 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763805 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763808 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:24:16.763789094 +0000 UTC m=+35.623893059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763806 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763865 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:24:16.76385162 +0000 UTC m=+35.623955581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763809 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763895 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:24:16.763882697 +0000 UTC m=+35.623986669 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763902 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found
Apr 21 04:24:15.764255 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:15.763949 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:24:16.763931865 +0000 UTC m=+35.624035836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:24:16.683696 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.683655 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:16.683696 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.683685 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:16.683696 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.683701 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:16.686749 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.686712 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 04:24:16.687580 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.687562 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 04:24:16.687875 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.687857 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 04:24:16.687875 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.687870 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tdp6m\""
Apr 21 04:24:16.688031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.687860 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9c9js\""
Apr 21 04:24:16.688031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.687902 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:24:16.772798 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.772772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.772827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.772856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:16.772874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.772936 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.772995 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.773015 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:24:18.772996848 +0000 UTC m=+37.633100807 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.773015 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.773048 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:24:18.773036779 +0000 UTC m=+37.633140739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.772998 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.773065 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.773071 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:24:18.773057794 +0000 UTC m=+37.633161753 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found
Apr 21 04:24:16.773250 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:16.773098 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:24:18.773089411 +0000 UTC m=+37.633193370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:24:17.833101 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.833073 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"]
Apr 21 04:24:17.850895 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.850865 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b3ed1e6-f33c-4bff-99f0-e47538021110" containerID="23533dca23487bfbcd9f02c4c227e31f88aa363bac4a787fc0404d01be177014" exitCode=0
Apr 21 04:24:17.856698 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.856676 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"]
Apr 21 04:24:17.857027 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.857003 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"
Apr 21 04:24:17.859716 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.859684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 04:24:17.859716 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.859701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 04:24:17.859888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.859701 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7jr5z\""
Apr 21 04:24:17.859888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.859843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 21 04:24:17.859976 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.859934 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 04:24:17.872548 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.872532 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"]
Apr 21 04:24:17.872616 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.872553 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:17.874749 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.874713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 04:24:17.890899 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.890877 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerDied","Data":"23533dca23487bfbcd9f02c4c227e31f88aa363bac4a787fc0404d01be177014"}
Apr 21 04:24:17.890986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.890905 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"]
Apr 21 04:24:17.890986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.890918 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"]
Apr 21 04:24:17.890986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.890926 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"]
Apr 21 04:24:17.891079 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.891015 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"
Apr 21 04:24:17.893062 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.893045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 21 04:24:17.893303 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.893287 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 21 04:24:17.893303 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.893296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 21 04:24:17.893429 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.893289 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 21 04:24:17.982468 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.982436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8db94413-d57f-4629-b3a5-631e0c31a8b6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-74779f6785-km27b\" (UID: \"8db94413-d57f-4629-b3a5-631e0c31a8b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"
Apr 21 04:24:17.982633 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.982575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbwg\" (UniqueName: \"kubernetes.io/projected/b328e694-4fc1-4021-88a1-a6a6965396b0-kube-api-access-grbwg\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:17.982698 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.982656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5lxv\" (UniqueName: \"kubernetes.io/projected/8db94413-d57f-4629-b3a5-631e0c31a8b6-kube-api-access-f5lxv\") pod \"managed-serviceaccount-addon-agent-74779f6785-km27b\" (UID: \"8db94413-d57f-4629-b3a5-631e0c31a8b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"
Apr 21 04:24:17.982780 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.982765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b328e694-4fc1-4021-88a1-a6a6965396b0-klusterlet-config\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:17.982827 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:17.982800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b328e694-4fc1-4021-88a1-a6a6965396b0-tmp\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:18.084012 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.083937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b328e694-4fc1-4021-88a1-a6a6965396b0-klusterlet-config\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:18.084012 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.083981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b328e694-4fc1-4021-88a1-a6a6965396b0-tmp\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:18.084185 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"
Apr 21 04:24:18.084185 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8db94413-d57f-4629-b3a5-631e0c31a8b6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-74779f6785-km27b\" (UID: \"8db94413-d57f-4629-b3a5-631e0c31a8b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"
Apr 21 04:24:18.084290 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-ca\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"
Apr 21 04:24:18.084290 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"
Apr 21 04:24:18.084394 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084289 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47lf\" (UniqueName: \"kubernetes.io/projected/ca792d14-29bc-4254-9900-24c8dd60bb22-kube-api-access-l47lf\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"
Apr 21 04:24:18.084394 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b328e694-4fc1-4021-88a1-a6a6965396b0-tmp\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:18.084394 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5lxv\" (UniqueName: \"kubernetes.io/projected/8db94413-d57f-4629-b3a5-631e0c31a8b6-kube-api-access-f5lxv\") pod \"managed-serviceaccount-addon-agent-74779f6785-km27b\" (UID: \"8db94413-d57f-4629-b3a5-631e0c31a8b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"
Apr 21 04:24:18.084394 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084367 2574 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ca792d14-29bc-4254-9900-24c8dd60bb22-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.084576 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084404 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-hub\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.084576 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.084479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grbwg\" (UniqueName: \"kubernetes.io/projected/b328e694-4fc1-4021-88a1-a6a6965396b0-kube-api-access-grbwg\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" Apr 21 04:24:18.086538 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.086516 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8db94413-d57f-4629-b3a5-631e0c31a8b6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-74779f6785-km27b\" (UID: \"8db94413-d57f-4629-b3a5-631e0c31a8b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" Apr 21 04:24:18.088629 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.088610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/b328e694-4fc1-4021-88a1-a6a6965396b0-klusterlet-config\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" Apr 21 04:24:18.091476 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.091457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5lxv\" (UniqueName: \"kubernetes.io/projected/8db94413-d57f-4629-b3a5-631e0c31a8b6-kube-api-access-f5lxv\") pod \"managed-serviceaccount-addon-agent-74779f6785-km27b\" (UID: \"8db94413-d57f-4629-b3a5-631e0c31a8b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" Apr 21 04:24:18.092000 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.091983 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbwg\" (UniqueName: \"kubernetes.io/projected/b328e694-4fc1-4021-88a1-a6a6965396b0-kube-api-access-grbwg\") pod \"klusterlet-addon-workmgr-697999f9f6-4fqmz\" (UID: \"b328e694-4fc1-4021-88a1-a6a6965396b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" Apr 21 04:24:18.178594 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.178556 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" Apr 21 04:24:18.185214 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.184968 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" Apr 21 04:24:18.185342 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.185212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.185342 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.185249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-ca\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.185342 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.185273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.185342 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.185317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l47lf\" (UniqueName: \"kubernetes.io/projected/ca792d14-29bc-4254-9900-24c8dd60bb22-kube-api-access-l47lf\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.185568 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:24:18.185358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ca792d14-29bc-4254-9900-24c8dd60bb22-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.185568 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.185391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-hub\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.193682 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.193535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ca792d14-29bc-4254-9900-24c8dd60bb22-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.193824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.193604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-hub\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.193824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.193627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-ca\") pod 
\"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.193824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.193649 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.193824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.193761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ca792d14-29bc-4254-9900-24c8dd60bb22-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.194223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.194198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47lf\" (UniqueName: \"kubernetes.io/projected/ca792d14-29bc-4254-9900-24c8dd60bb22-kube-api-access-l47lf\") pod \"cluster-proxy-proxy-agent-6549466998-j4mmq\" (UID: \"ca792d14-29bc-4254-9900-24c8dd60bb22\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.200525 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.200504 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:24:18.345009 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.344919 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"] Apr 21 04:24:18.347290 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.347266 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq"] Apr 21 04:24:18.349029 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:24:18.348995 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb328e694_4fc1_4021_88a1_a6a6965396b0.slice/crio-05aa945a20adf5ea0a711c239094e37da2f557d4b8eb2989129295395bb894e6 WatchSource:0}: Error finding container 05aa945a20adf5ea0a711c239094e37da2f557d4b8eb2989129295395bb894e6: Status 404 returned error can't find the container with id 05aa945a20adf5ea0a711c239094e37da2f557d4b8eb2989129295395bb894e6 Apr 21 04:24:18.349972 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.349950 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b"] Apr 21 04:24:18.350316 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:24:18.350297 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca792d14_29bc_4254_9900_24c8dd60bb22.slice/crio-dc8a150a5ff598bfc6ab485b03a5f1b25c39f026fa761fcaa14d3cb1c0fb6d4c WatchSource:0}: Error finding container dc8a150a5ff598bfc6ab485b03a5f1b25c39f026fa761fcaa14d3cb1c0fb6d4c: Status 404 returned error can't find the container with id dc8a150a5ff598bfc6ab485b03a5f1b25c39f026fa761fcaa14d3cb1c0fb6d4c Apr 21 04:24:18.364451 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:24:18.364432 2574 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db94413_d57f_4629_b3a5_631e0c31a8b6.slice/crio-bf40cf8b04859abb2f205ddb874a39c0ce1e487f2e2955c9098f42dbf7964546 WatchSource:0}: Error finding container bf40cf8b04859abb2f205ddb874a39c0ce1e487f2e2955c9098f42dbf7964546: Status 404 returned error can't find the container with id bf40cf8b04859abb2f205ddb874a39c0ce1e487f2e2955c9098f42dbf7964546 Apr 21 04:24:18.789709 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.789668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:18.789908 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.789743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:18.789908 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.789835 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 04:24:18.789908 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.789848 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:24:18.789908 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.789868 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found Apr 21 04:24:18.789908 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.789885 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:24:22.789871337 +0000 UTC m=+41.649975296 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.789916 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:24:22.789900334 +0000 UTC m=+41.650004297 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.789962 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.789997 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56" Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.790073 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.790103 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.790127 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:24:22.790116329 +0000 UTC m=+41.650220287 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found Apr 21 04:24:18.790169 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:18.790142 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:24:22.790134923 +0000 UTC m=+41.650238879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found Apr 21 04:24:18.853773 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.853716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" event={"ID":"ca792d14-29bc-4254-9900-24c8dd60bb22","Type":"ContainerStarted","Data":"dc8a150a5ff598bfc6ab485b03a5f1b25c39f026fa761fcaa14d3cb1c0fb6d4c"} Apr 21 04:24:18.854828 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.854804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" event={"ID":"b328e694-4fc1-4021-88a1-a6a6965396b0","Type":"ContainerStarted","Data":"05aa945a20adf5ea0a711c239094e37da2f557d4b8eb2989129295395bb894e6"} Apr 21 04:24:18.855705 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.855685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" 
event={"ID":"8db94413-d57f-4629-b3a5-631e0c31a8b6","Type":"ContainerStarted","Data":"bf40cf8b04859abb2f205ddb874a39c0ce1e487f2e2955c9098f42dbf7964546"} Apr 21 04:24:18.858442 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.858416 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b3ed1e6-f33c-4bff-99f0-e47538021110" containerID="a4087346474b8680fdf193c603c42291f485e6a10053811c5d29d26f160038e1" exitCode=0 Apr 21 04:24:18.858533 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:18.858467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerDied","Data":"a4087346474b8680fdf193c603c42291f485e6a10053811c5d29d26f160038e1"} Apr 21 04:24:19.874781 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:19.874746 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-85nl4" event={"ID":"2b3ed1e6-f33c-4bff-99f0-e47538021110","Type":"ContainerStarted","Data":"354686f9d3241dab3a3dccb27bef206bd662935daae08358c789a1bd63878ebd"} Apr 21 04:24:21.713815 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:21.713750 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-85nl4" podStartSLOduration=8.518423696 podStartE2EDuration="40.713710362s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:23:44.540123236 +0000 UTC m=+3.400227194" lastFinishedPulling="2026-04-21 04:24:16.735409904 +0000 UTC m=+35.595513860" observedRunningTime="2026-04-21 04:24:19.901129516 +0000 UTC m=+38.761233520" watchObservedRunningTime="2026-04-21 04:24:21.713710362 +0000 UTC m=+40.573814357" Apr 21 04:24:22.826487 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:22.826446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8" Apr 21 04:24:22.826487 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:22.826494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56" Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:22.826575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826624 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826681 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826697 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:22.826623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: 
\"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826700 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:24:30.826678106 +0000 UTC m=+49.686782071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826761 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826769 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:24:30.826748414 +0000 UTC m=+49.686852375 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826778 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826795 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:24:30.82678299 +0000 UTC m=+49.686887169 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found Apr 21 04:24:22.827011 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:22.826819 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:24:30.826807885 +0000 UTC m=+49.686911857 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:24:24.140157 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.140114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:24.143604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.143561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ce0750e5-90f6-418d-9510-f807b89c746c-original-pull-secret\") pod \"global-pull-secret-syncer-csr9d\" (UID: \"ce0750e5-90f6-418d-9510-f807b89c746c\") " pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:24.199678 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.199643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-csr9d"
Apr 21 04:24:24.721523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.721496 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-csr9d"]
Apr 21 04:24:24.724390 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:24:24.724332 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0750e5_90f6_418d_9510_f807b89c746c.slice/crio-20e3bb1da30bc3d8d7304dede8548245501470c10f8acb0226a812deab5f8d50 WatchSource:0}: Error finding container 20e3bb1da30bc3d8d7304dede8548245501470c10f8acb0226a812deab5f8d50: Status 404 returned error can't find the container with id 20e3bb1da30bc3d8d7304dede8548245501470c10f8acb0226a812deab5f8d50
Apr 21 04:24:24.886173 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.886131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" event={"ID":"b328e694-4fc1-4021-88a1-a6a6965396b0","Type":"ContainerStarted","Data":"cdf4bc0adc7bd909dcc0c43b937c8e2094eece8934c032237c162eb81cb4f42a"}
Apr 21 04:24:24.886385 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.886359 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:24.887308 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.887280 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-csr9d" event={"ID":"ce0750e5-90f6-418d-9510-f807b89c746c","Type":"ContainerStarted","Data":"20e3bb1da30bc3d8d7304dede8548245501470c10f8acb0226a812deab5f8d50"}
Apr 21 04:24:24.888395 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.888372 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz"
Apr 21 04:24:24.888661 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.888639 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" event={"ID":"8db94413-d57f-4629-b3a5-631e0c31a8b6","Type":"ContainerStarted","Data":"7c973a0e5da89600d218a039d57da5a17f6532c655cba71a4ce1812f5ddc8ee2"}
Apr 21 04:24:24.889881 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.889864 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" event={"ID":"ca792d14-29bc-4254-9900-24c8dd60bb22","Type":"ContainerStarted","Data":"82111aa90f387c8571a4134dc5691aa0d35d7b2851c9efd48792380cb03af7fd"}
Apr 21 04:24:24.900819 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.900701 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" podStartSLOduration=1.659458833 podStartE2EDuration="7.900689397s" podCreationTimestamp="2026-04-21 04:24:17 +0000 UTC" firstStartedPulling="2026-04-21 04:24:18.350832897 +0000 UTC m=+37.210936856" lastFinishedPulling="2026-04-21 04:24:24.592063459 +0000 UTC m=+43.452167420" observedRunningTime="2026-04-21 04:24:24.900087495 +0000 UTC m=+43.760191476" watchObservedRunningTime="2026-04-21 04:24:24.900689397 +0000 UTC m=+43.760793399"
Apr 21 04:24:24.913872 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:24.913830 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" podStartSLOduration=1.6835349499999999 podStartE2EDuration="7.913818429s" podCreationTimestamp="2026-04-21 04:24:17 +0000 UTC" firstStartedPulling="2026-04-21 04:24:18.366519236 +0000 UTC m=+37.226623193" lastFinishedPulling="2026-04-21 04:24:24.596802704 +0000 UTC m=+43.456906672" observedRunningTime="2026-04-21 04:24:24.912866599 +0000 UTC m=+43.772970579" watchObservedRunningTime="2026-04-21 04:24:24.913818429 +0000 UTC m=+43.773922408"
Apr 21 04:24:27.898310 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:27.898268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" event={"ID":"ca792d14-29bc-4254-9900-24c8dd60bb22","Type":"ContainerStarted","Data":"e3e44f77269698bd3ed8fac2c8a4dde84770dc9596d2a47c418695c2e387d4d3"}
Apr 21 04:24:27.898310 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:27.898316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" event={"ID":"ca792d14-29bc-4254-9900-24c8dd60bb22","Type":"ContainerStarted","Data":"4bd606847d5085867437b758f5cc084aa439b2d89411ae79f06ac668787253b5"}
Apr 21 04:24:27.915305 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:27.915251 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" podStartSLOduration=2.23509284 podStartE2EDuration="10.915234154s" podCreationTimestamp="2026-04-21 04:24:17 +0000 UTC" firstStartedPulling="2026-04-21 04:24:18.362160784 +0000 UTC m=+37.222264741" lastFinishedPulling="2026-04-21 04:24:27.042302083 +0000 UTC m=+45.902406055" observedRunningTime="2026-04-21 04:24:27.913998714 +0000 UTC m=+46.774102711" watchObservedRunningTime="2026-04-21 04:24:27.915234154 +0000 UTC m=+46.775338135"
Apr 21 04:24:28.902879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:28.902838 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-csr9d" event={"ID":"ce0750e5-90f6-418d-9510-f807b89c746c","Type":"ContainerStarted","Data":"fdb5bce26b4c026d2d62cf9feec9d645bd7cd1385d9a45e972874c9456ad9846"}
Apr 21 04:24:28.916166 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:28.916115 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-csr9d" podStartSLOduration=32.910991781 podStartE2EDuration="36.916103239s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:24:24.726555177 +0000 UTC m=+43.586659140" lastFinishedPulling="2026-04-21 04:24:28.731666638 +0000 UTC m=+47.591770598" observedRunningTime="2026-04-21 04:24:28.915660442 +0000 UTC m=+47.775764421" watchObservedRunningTime="2026-04-21 04:24:28.916103239 +0000 UTC m=+47.776207219"
Apr 21 04:24:30.899955 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:30.899920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:30.899955 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:30.899956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:30.900003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:30.900030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900076 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900145 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:24:46.90012654 +0000 UTC m=+65.760230500 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900171 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900206 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900184 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900234 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:24:46.900215007 +0000 UTC m=+65.760318987 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900238 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900257 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:24:46.900245605 +0000 UTC m=+65.760349569 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found
Apr 21 04:24:30.900439 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:30.900283 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:24:46.900269463 +0000 UTC m=+65.760373434 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:24:40.847910 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:40.847877 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-szzqq"
Apr 21 04:24:46.929688 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:46.929644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:24:46.929688 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:46.929701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:46.929806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:46.929840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929812 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929939 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929947 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:25:18.929926864 +0000 UTC m=+97.790030822 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929983 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:25:18.92997078 +0000 UTC m=+97.790074736 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929839 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929998 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.930024 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:25:18.930016504 +0000 UTC m=+97.790120504 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.929874 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:24:46.930210 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:46.930049 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:25:18.93004321 +0000 UTC m=+97.790147167 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found
Apr 21 04:24:47.434136 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.434094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:47.434302 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.434155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:24:47.436587 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.436570 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 04:24:47.436652 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.436637 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 04:24:47.444481 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:47.444463 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 04:24:47.444533 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:24:47.444518 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:51.444503248 +0000 UTC m=+130.304607216 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : secret "metrics-daemon-secret" not found
Apr 21 04:24:47.446718 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.446704 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 04:24:47.457603 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.457587 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6bf\" (UniqueName: \"kubernetes.io/projected/bf24d380-8460-46e7-b05d-3783cbfac351-kube-api-access-nk6bf\") pod \"network-check-target-wxgvr\" (UID: \"bf24d380-8460-46e7-b05d-3783cbfac351\") " pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:47.606026 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.605999 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tdp6m\""
Apr 21 04:24:47.613864 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.613844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:47.725470 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.725439 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wxgvr"]
Apr 21 04:24:47.728707 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:24:47.728671 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf24d380_8460_46e7_b05d_3783cbfac351.slice/crio-5b12e9097da8070e2d7b16f3b68af04eb6bfd8211ef2dbce1af4e7469c04a217 WatchSource:0}: Error finding container 5b12e9097da8070e2d7b16f3b68af04eb6bfd8211ef2dbce1af4e7469c04a217: Status 404 returned error can't find the container with id 5b12e9097da8070e2d7b16f3b68af04eb6bfd8211ef2dbce1af4e7469c04a217
Apr 21 04:24:47.950718 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:47.950625 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wxgvr" event={"ID":"bf24d380-8460-46e7-b05d-3783cbfac351","Type":"ContainerStarted","Data":"5b12e9097da8070e2d7b16f3b68af04eb6bfd8211ef2dbce1af4e7469c04a217"}
Apr 21 04:24:50.960233 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:50.960189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wxgvr" event={"ID":"bf24d380-8460-46e7-b05d-3783cbfac351","Type":"ContainerStarted","Data":"969e7f2331ea30c3646f059e75253d96dc3bd952ee0da829bd6aa51b0d96b53b"}
Apr 21 04:24:50.960605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:50.960328 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:24:50.974535 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:24:50.974484 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wxgvr" podStartSLOduration=67.306761185 podStartE2EDuration="1m9.974470598s" podCreationTimestamp="2026-04-21 04:23:41 +0000 UTC" firstStartedPulling="2026-04-21 04:24:47.730433293 +0000 UTC m=+66.590537249" lastFinishedPulling="2026-04-21 04:24:50.398142699 +0000 UTC m=+69.258246662" observedRunningTime="2026-04-21 04:24:50.973427454 +0000 UTC m=+69.833531436" watchObservedRunningTime="2026-04-21 04:24:50.974470598 +0000 UTC m=+69.834574577"
Apr 21 04:25:18.970407 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:25:18.970319 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:25:18.970407 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:25:18.970382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:25:18.970410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:25:18.970433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970477 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970506 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970516 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970537 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55fbc49767-s2v7s: secret "image-registry-tls" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970538 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert podName:65dc137e-873c-4d24-b876-d35819405cac nodeName:}" failed. No retries permitted until 2026-04-21 04:26:22.970550013 +0000 UTC m=+161.830653990 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert") pod "ingress-canary-rbw56" (UID: "65dc137e-873c-4d24-b876-d35819405cac") : secret "canary-serving-cert" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970587 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls podName:3910765c-b3b6-4194-b81f-e0ede65ac33b nodeName:}" failed. No retries permitted until 2026-04-21 04:26:22.970580322 +0000 UTC m=+161.830684280 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls") pod "image-registry-55fbc49767-s2v7s" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b") : secret "image-registry-tls" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970598 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert podName:668d3b91-6679-4a3c-af26-520078ad97bd nodeName:}" failed. No retries permitted until 2026-04-21 04:26:22.970593014 +0000 UTC m=+161.830696971 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-985mb" (UID: "668d3b91-6679-4a3c-af26-520078ad97bd") : secret "networking-console-plugin-cert" not found
Apr 21 04:25:18.970917 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:18.970608 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls podName:9891f7eb-3b3e-4b55-8ce3-97281e95db7e nodeName:}" failed. No retries permitted until 2026-04-21 04:26:22.970603426 +0000 UTC m=+161.830707384 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls") pod "dns-default-tz8h8" (UID: "9891f7eb-3b3e-4b55-8ce3-97281e95db7e") : secret "dns-default-metrics-tls" not found
Apr 21 04:25:21.966159 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:25:21.966126 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wxgvr"
Apr 21 04:25:51.494702 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:25:51.494643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7"
Apr 21 04:25:51.495209 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:51.494820 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 04:25:51.495209 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:25:51.494891 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs podName:c904be64-c3a8-4c8b-97c6-75f15dbf9670 nodeName:}" failed. No retries permitted until 2026-04-21 04:27:53.494872056 +0000 UTC m=+252.354976033 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs") pod "network-metrics-daemon-bwjm7" (UID: "c904be64-c3a8-4c8b-97c6-75f15dbf9670") : secret "metrics-daemon-secret" not found
Apr 21 04:26:01.561982 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:01.561951 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7b454_9d7bcf82-4223-4a89-a957-2d50a0af065e/dns-node-resolver/0.log"
Apr 21 04:26:02.563261 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:02.563235 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pqxgt_59fe2a00-5ff4-4e40-b04e-88c5b7b9b743/node-ca/0.log"
Apr 21 04:26:18.055322 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:18.055277 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" podUID="668d3b91-6679-4a3c-af26-520078ad97bd"
Apr 21 04:26:18.063579 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:18.063549 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b"
Apr 21 04:26:18.070673 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:18.070643 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tz8h8" podUID="9891f7eb-3b3e-4b55-8ce3-97281e95db7e"
Apr 21 04:26:18.076813 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:18.076793 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rbw56" podUID="65dc137e-873c-4d24-b876-d35819405cac"
Apr 21 04:26:18.163493 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:18.163460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:26:18.163493 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:18.163483 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:26:18.163714 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:18.163500 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:26:18.163714 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:18.163650 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:26:19.694094 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:19.694056 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bwjm7" podUID="c904be64-c3a8-4c8b-97c6-75f15dbf9670"
Apr 21 04:26:23.037704 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.037640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:26:23.037704 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.037714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:26:23.038268 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.037768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:26:23.038268 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.037838 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:26:23.040028 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.040004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9891f7eb-3b3e-4b55-8ce3-97281e95db7e-metrics-tls\") pod \"dns-default-tz8h8\" (UID: \"9891f7eb-3b3e-4b55-8ce3-97281e95db7e\") " pod="openshift-dns/dns-default-tz8h8"
Apr 21 04:26:23.040137 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.040060 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/668d3b91-6679-4a3c-af26-520078ad97bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-985mb\" (UID: \"668d3b91-6679-4a3c-af26-520078ad97bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:26:23.040390 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.040373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65dc137e-873c-4d24-b876-d35819405cac-cert\") pod \"ingress-canary-rbw56\" (UID: \"65dc137e-873c-4d24-b876-d35819405cac\") " pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:26:23.040447 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.040393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"image-registry-55fbc49767-s2v7s\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:26:23.268460 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.268428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tv2rf\""
Apr 21 04:26:23.268712 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.268487 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vk74d\""
Apr 21 04:26:23.268712 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.268525 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l78gs\""
Apr 21 04:26:23.268712 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.268487 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v6cxl\""
Apr 21 04:26:23.275587 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.275558 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb"
Apr 21 04:26:23.275587 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.275565 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbw56"
Apr 21 04:26:23.275834 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.275685 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s"
Apr 21 04:26:23.275834 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.275768 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-tz8h8" Apr 21 04:26:23.436573 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.436538 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55fbc49767-s2v7s"] Apr 21 04:26:23.440066 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:26:23.440035 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3910765c_b3b6_4194_b81f_e0ede65ac33b.slice/crio-7a0fc78f134ebbce8701cd0eca9899f914daf2e6545322025eeadfdf14fa1d20 WatchSource:0}: Error finding container 7a0fc78f134ebbce8701cd0eca9899f914daf2e6545322025eeadfdf14fa1d20: Status 404 returned error can't find the container with id 7a0fc78f134ebbce8701cd0eca9899f914daf2e6545322025eeadfdf14fa1d20 Apr 21 04:26:23.441943 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.441919 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-crssf"] Apr 21 04:26:23.446827 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.446805 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.450564 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.450537 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 04:26:23.451378 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.450944 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:26:23.451378 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.450990 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qm4pr\"" Apr 21 04:26:23.451378 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.450977 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 04:26:23.451378 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.451260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:26:23.451716 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.451697 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbw56"] Apr 21 04:26:23.456267 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:26:23.456232 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65dc137e_873c_4d24_b876_d35819405cac.slice/crio-609b0af028b60f6f292120ce54a0ad00111b3e51e6e1485384f44365ba18a643 WatchSource:0}: Error finding container 609b0af028b60f6f292120ce54a0ad00111b3e51e6e1485384f44365ba18a643: Status 404 returned error can't find the container with id 609b0af028b60f6f292120ce54a0ad00111b3e51e6e1485384f44365ba18a643 Apr 21 04:26:23.462507 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.462482 
2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-crssf"] Apr 21 04:26:23.542676 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.542595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/81d1ae8a-86ba-4852-83da-1e18ebe907b1-crio-socket\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.542848 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.542687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/81d1ae8a-86ba-4852-83da-1e18ebe907b1-data-volume\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.542848 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.542715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/81d1ae8a-86ba-4852-83da-1e18ebe907b1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.542848 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.542817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jsq6\" (UniqueName: \"kubernetes.io/projected/81d1ae8a-86ba-4852-83da-1e18ebe907b1-kube-api-access-6jsq6\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.542954 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.542884 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/81d1ae8a-86ba-4852-83da-1e18ebe907b1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644364 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/81d1ae8a-86ba-4852-83da-1e18ebe907b1-crio-socket\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644549 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/81d1ae8a-86ba-4852-83da-1e18ebe907b1-data-volume\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644549 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/81d1ae8a-86ba-4852-83da-1e18ebe907b1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644549 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/81d1ae8a-86ba-4852-83da-1e18ebe907b1-crio-socket\") pod \"insights-runtime-extractor-crssf\" (UID: 
\"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644549 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jsq6\" (UniqueName: \"kubernetes.io/projected/81d1ae8a-86ba-4852-83da-1e18ebe907b1-kube-api-access-6jsq6\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/81d1ae8a-86ba-4852-83da-1e18ebe907b1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.644939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.644794 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/81d1ae8a-86ba-4852-83da-1e18ebe907b1-data-volume\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.645035 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.645012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/81d1ae8a-86ba-4852-83da-1e18ebe907b1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.646947 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.646919 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/81d1ae8a-86ba-4852-83da-1e18ebe907b1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.654392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.654365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jsq6\" (UniqueName: \"kubernetes.io/projected/81d1ae8a-86ba-4852-83da-1e18ebe907b1-kube-api-access-6jsq6\") pod \"insights-runtime-extractor-crssf\" (UID: \"81d1ae8a-86ba-4852-83da-1e18ebe907b1\") " pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.668563 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.668536 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tz8h8"] Apr 21 04:26:23.671967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.671945 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-985mb"] Apr 21 04:26:23.672082 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:26:23.672051 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9891f7eb_3b3e_4b55_8ce3_97281e95db7e.slice/crio-c85103eb16b6add24b85f46d2979498e4ae59639f5b61e8a536b469ea1aa562b WatchSource:0}: Error finding container c85103eb16b6add24b85f46d2979498e4ae59639f5b61e8a536b469ea1aa562b: Status 404 returned error can't find the container with id c85103eb16b6add24b85f46d2979498e4ae59639f5b61e8a536b469ea1aa562b Apr 21 04:26:23.675094 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:26:23.675067 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod668d3b91_6679_4a3c_af26_520078ad97bd.slice/crio-48914d277f14c8a889e1cc2039eccb25dd3072ac62b76a2d79113def448d2efc 
WatchSource:0}: Error finding container 48914d277f14c8a889e1cc2039eccb25dd3072ac62b76a2d79113def448d2efc: Status 404 returned error can't find the container with id 48914d277f14c8a889e1cc2039eccb25dd3072ac62b76a2d79113def448d2efc Apr 21 04:26:23.758297 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.758263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-crssf" Apr 21 04:26:23.876410 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:23.876381 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-crssf"] Apr 21 04:26:23.879787 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:26:23.879755 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d1ae8a_86ba_4852_83da_1e18ebe907b1.slice/crio-e2a70836e57abc3b6cc5f19dc7960254203dfd412e8b7092e7642c9cd963b68d WatchSource:0}: Error finding container e2a70836e57abc3b6cc5f19dc7960254203dfd412e8b7092e7642c9cd963b68d: Status 404 returned error can't find the container with id e2a70836e57abc3b6cc5f19dc7960254203dfd412e8b7092e7642c9cd963b68d Apr 21 04:26:24.179354 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.179308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-crssf" event={"ID":"81d1ae8a-86ba-4852-83da-1e18ebe907b1","Type":"ContainerStarted","Data":"eedcfc973ea4c8dcc4670f3bc8134aa917c84d480a96f886779f6381b9c8f941"} Apr 21 04:26:24.179354 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.179349 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-crssf" event={"ID":"81d1ae8a-86ba-4852-83da-1e18ebe907b1","Type":"ContainerStarted","Data":"e2a70836e57abc3b6cc5f19dc7960254203dfd412e8b7092e7642c9cd963b68d"} Apr 21 04:26:24.180804 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.180431 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbw56" event={"ID":"65dc137e-873c-4d24-b876-d35819405cac","Type":"ContainerStarted","Data":"609b0af028b60f6f292120ce54a0ad00111b3e51e6e1485384f44365ba18a643"} Apr 21 04:26:24.181618 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.181593 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" event={"ID":"668d3b91-6679-4a3c-af26-520078ad97bd","Type":"ContainerStarted","Data":"48914d277f14c8a889e1cc2039eccb25dd3072ac62b76a2d79113def448d2efc"} Apr 21 04:26:24.182784 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.182754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tz8h8" event={"ID":"9891f7eb-3b3e-4b55-8ce3-97281e95db7e","Type":"ContainerStarted","Data":"c85103eb16b6add24b85f46d2979498e4ae59639f5b61e8a536b469ea1aa562b"} Apr 21 04:26:24.184228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.184203 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" event={"ID":"3910765c-b3b6-4194-b81f-e0ede65ac33b","Type":"ContainerStarted","Data":"f75ccb028e63525148cc05f8078ca5a5c30eb0d3327f9b6e17e20ba0b35e6766"} Apr 21 04:26:24.184228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.184234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" event={"ID":"3910765c-b3b6-4194-b81f-e0ede65ac33b","Type":"ContainerStarted","Data":"7a0fc78f134ebbce8701cd0eca9899f914daf2e6545322025eeadfdf14fa1d20"} Apr 21 04:26:24.184383 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.184355 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:26:24.203426 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.203374 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" podStartSLOduration=162.203360586 podStartE2EDuration="2m42.203360586s" podCreationTimestamp="2026-04-21 04:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:26:24.202549551 +0000 UTC m=+163.062653531" watchObservedRunningTime="2026-04-21 04:26:24.203360586 +0000 UTC m=+163.063464566" Apr 21 04:26:24.887048 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:24.886972 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" podUID="b328e694-4fc1-4021-88a1-a6a6965396b0" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused" Apr 21 04:26:25.189032 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:25.188996 2574 generic.go:358] "Generic (PLEG): container finished" podID="b328e694-4fc1-4021-88a1-a6a6965396b0" containerID="cdf4bc0adc7bd909dcc0c43b937c8e2094eece8934c032237c162eb81cb4f42a" exitCode=1 Apr 21 04:26:25.189490 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:25.189072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" event={"ID":"b328e694-4fc1-4021-88a1-a6a6965396b0","Type":"ContainerDied","Data":"cdf4bc0adc7bd909dcc0c43b937c8e2094eece8934c032237c162eb81cb4f42a"} Apr 21 04:26:25.189490 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:25.189469 2574 scope.go:117] "RemoveContainer" containerID="cdf4bc0adc7bd909dcc0c43b937c8e2094eece8934c032237c162eb81cb4f42a" Apr 21 04:26:25.190394 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:25.190374 2574 generic.go:358] "Generic (PLEG): container finished" podID="8db94413-d57f-4629-b3a5-631e0c31a8b6" containerID="7c973a0e5da89600d218a039d57da5a17f6532c655cba71a4ce1812f5ddc8ee2" 
exitCode=255 Apr 21 04:26:25.190527 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:25.190455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" event={"ID":"8db94413-d57f-4629-b3a5-631e0c31a8b6","Type":"ContainerDied","Data":"7c973a0e5da89600d218a039d57da5a17f6532c655cba71a4ce1812f5ddc8ee2"} Apr 21 04:26:25.190897 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:25.190874 2574 scope.go:117] "RemoveContainer" containerID="7c973a0e5da89600d218a039d57da5a17f6532c655cba71a4ce1812f5ddc8ee2" Apr 21 04:26:26.195421 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.195387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tz8h8" event={"ID":"9891f7eb-3b3e-4b55-8ce3-97281e95db7e","Type":"ContainerStarted","Data":"1521cc53aa8744ef662332ec4b8fc6bf9db2ad032c1599b90125f54d0db7b619"} Apr 21 04:26:26.195421 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.195423 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tz8h8" event={"ID":"9891f7eb-3b3e-4b55-8ce3-97281e95db7e","Type":"ContainerStarted","Data":"985fa9320a9ad50a20ce7debeab60677f4e5ea83bad3586daf67fbf2eb9601f5"} Apr 21 04:26:26.195933 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.195501 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tz8h8" Apr 21 04:26:26.196944 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.196924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" event={"ID":"b328e694-4fc1-4021-88a1-a6a6965396b0","Type":"ContainerStarted","Data":"6d1ceb19285c8dab817497ec1a8495d4e29b13ab4a8d993f62422acf3632368a"} Apr 21 04:26:26.197189 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.197172 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" Apr 21 04:26:26.197851 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.197834 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697999f9f6-4fqmz" Apr 21 04:26:26.198521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.198503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-crssf" event={"ID":"81d1ae8a-86ba-4852-83da-1e18ebe907b1","Type":"ContainerStarted","Data":"ebeaba36bfc32120ae559438b42d34fa50630e0754718c212bd01f10a3c9af85"} Apr 21 04:26:26.199765 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.199743 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbw56" event={"ID":"65dc137e-873c-4d24-b876-d35819405cac","Type":"ContainerStarted","Data":"bfbe8693fa8f985ad2cad33b7febda66f701d818d60c8924f73149c9bc004070"} Apr 21 04:26:26.200874 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.200847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" event={"ID":"668d3b91-6679-4a3c-af26-520078ad97bd","Type":"ContainerStarted","Data":"2f326bd687f34de2b673b5f1c4f9e7c8067ab6b23974f4c85913b05de2ae8109"} Apr 21 04:26:26.202333 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.202305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-74779f6785-km27b" event={"ID":"8db94413-d57f-4629-b3a5-631e0c31a8b6","Type":"ContainerStarted","Data":"18e5d7de353197034974d0cd916c158e1e7016fca6db3774a25d7b39a25519ee"} Apr 21 04:26:26.224625 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.224585 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-985mb" 
podStartSLOduration=158.062259498 podStartE2EDuration="2m40.224573803s" podCreationTimestamp="2026-04-21 04:23:46 +0000 UTC" firstStartedPulling="2026-04-21 04:26:23.6768519 +0000 UTC m=+162.536955858" lastFinishedPulling="2026-04-21 04:26:25.839166192 +0000 UTC m=+164.699270163" observedRunningTime="2026-04-21 04:26:26.223632244 +0000 UTC m=+165.083736223" watchObservedRunningTime="2026-04-21 04:26:26.224573803 +0000 UTC m=+165.084677823" Apr 21 04:26:26.224833 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.224808 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tz8h8" podStartSLOduration=129.060180411 podStartE2EDuration="2m11.224799096s" podCreationTimestamp="2026-04-21 04:24:15 +0000 UTC" firstStartedPulling="2026-04-21 04:26:23.674496082 +0000 UTC m=+162.534600042" lastFinishedPulling="2026-04-21 04:26:25.839114754 +0000 UTC m=+164.699218727" observedRunningTime="2026-04-21 04:26:26.211093884 +0000 UTC m=+165.071197862" watchObservedRunningTime="2026-04-21 04:26:26.224799096 +0000 UTC m=+165.084903076" Apr 21 04:26:26.238326 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:26.238279 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rbw56" podStartSLOduration=128.857400306 podStartE2EDuration="2m11.238268322s" podCreationTimestamp="2026-04-21 04:24:15 +0000 UTC" firstStartedPulling="2026-04-21 04:26:23.45831213 +0000 UTC m=+162.318416087" lastFinishedPulling="2026-04-21 04:26:25.839180147 +0000 UTC m=+164.699284103" observedRunningTime="2026-04-21 04:26:26.236285916 +0000 UTC m=+165.096389909" watchObservedRunningTime="2026-04-21 04:26:26.238268322 +0000 UTC m=+165.098372302" Apr 21 04:26:28.208282 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:28.208239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-crssf" 
event={"ID":"81d1ae8a-86ba-4852-83da-1e18ebe907b1","Type":"ContainerStarted","Data":"6ac0f8ef38e53bc37ab5690f3b6be0e9bd25a5d5fb9156d33a9ce81321367b14"} Apr 21 04:26:28.224349 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:28.224302 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-crssf" podStartSLOduration=1.938564809 podStartE2EDuration="5.224289787s" podCreationTimestamp="2026-04-21 04:26:23 +0000 UTC" firstStartedPulling="2026-04-21 04:26:23.934441746 +0000 UTC m=+162.794545706" lastFinishedPulling="2026-04-21 04:26:27.220166727 +0000 UTC m=+166.080270684" observedRunningTime="2026-04-21 04:26:28.224042902 +0000 UTC m=+167.084146881" watchObservedRunningTime="2026-04-21 04:26:28.224289787 +0000 UTC m=+167.084393766" Apr 21 04:26:31.684497 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:31.684466 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:26:36.207096 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:36.207067 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tz8h8" Apr 21 04:26:38.295437 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.295402 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pl7sm"] Apr 21 04:26:38.300399 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.300374 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.302682 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.302656 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bm2fq\"" Apr 21 04:26:38.302828 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.302807 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 04:26:38.302897 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.302818 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 04:26:38.303005 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.302979 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 04:26:38.303696 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.303679 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 04:26:38.303817 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.303737 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:26:38.303817 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.303800 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 04:26:38.460605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jgx6\" (UniqueName: \"kubernetes.io/projected/165e36f0-6193-4c39-a29d-a566ebef068d-kube-api-access-5jgx6\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " 
pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/165e36f0-6193-4c39-a29d-a566ebef068d-metrics-client-ca\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-sys\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-root\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460791 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-accelerators-collector-config\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460960 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-tls\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460960 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-textfile\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.460960 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.460903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-wtmp\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562108 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562108 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/165e36f0-6193-4c39-a29d-a566ebef068d-metrics-client-ca\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562108 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-sys\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-sys\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-root\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-accelerators-collector-config\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562358 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-tls\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-textfile\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-root\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-wtmp\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jgx6\" (UniqueName: \"kubernetes.io/projected/165e36f0-6193-4c39-a29d-a566ebef068d-kube-api-access-5jgx6\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 
21 04:26:38.562627 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:38.562463 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 04:26:38.562627 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:26:38.562542 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-tls podName:165e36f0-6193-4c39-a29d-a566ebef068d nodeName:}" failed. No retries permitted until 2026-04-21 04:26:39.062522327 +0000 UTC m=+177.922626298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-tls") pod "node-exporter-pl7sm" (UID: "165e36f0-6193-4c39-a29d-a566ebef068d") : secret "node-exporter-tls" not found Apr 21 04:26:38.562627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-wtmp\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-textfile\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.562896 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.562755 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-accelerators-collector-config\") 
pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.563188 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.563170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/165e36f0-6193-4c39-a29d-a566ebef068d-metrics-client-ca\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.564430 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.564413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:38.572793 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:38.572767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jgx6\" (UniqueName: \"kubernetes.io/projected/165e36f0-6193-4c39-a29d-a566ebef068d-kube-api-access-5jgx6\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:39.065970 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:39.065922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-tls\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:39.068159 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:39.068140 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/165e36f0-6193-4c39-a29d-a566ebef068d-node-exporter-tls\") pod \"node-exporter-pl7sm\" (UID: \"165e36f0-6193-4c39-a29d-a566ebef068d\") " pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:39.209405 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:39.209377 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pl7sm" Apr 21 04:26:39.217834 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:26:39.217800 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165e36f0_6193_4c39_a29d_a566ebef068d.slice/crio-86352a6cdb8d1c0e76af40e27acd2f157ab364dfbc63f9f8a43af692834bc306 WatchSource:0}: Error finding container 86352a6cdb8d1c0e76af40e27acd2f157ab364dfbc63f9f8a43af692834bc306: Status 404 returned error can't find the container with id 86352a6cdb8d1c0e76af40e27acd2f157ab364dfbc63f9f8a43af692834bc306 Apr 21 04:26:39.235655 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:39.235620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pl7sm" event={"ID":"165e36f0-6193-4c39-a29d-a566ebef068d","Type":"ContainerStarted","Data":"86352a6cdb8d1c0e76af40e27acd2f157ab364dfbc63f9f8a43af692834bc306"} Apr 21 04:26:40.239886 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:40.239849 2574 generic.go:358] "Generic (PLEG): container finished" podID="165e36f0-6193-4c39-a29d-a566ebef068d" containerID="dc3f2ee5ad943aa1afc36817e1c4b09257c885aaa6d884f77483dede66c66fb6" exitCode=0 Apr 21 04:26:40.240327 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:40.239950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pl7sm" event={"ID":"165e36f0-6193-4c39-a29d-a566ebef068d","Type":"ContainerDied","Data":"dc3f2ee5ad943aa1afc36817e1c4b09257c885aaa6d884f77483dede66c66fb6"} Apr 21 04:26:41.244077 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:41.244040 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pl7sm" event={"ID":"165e36f0-6193-4c39-a29d-a566ebef068d","Type":"ContainerStarted","Data":"99b199bfa023093b85ab7ce44b85fe6f7d9548624fbd2b0c21ae932efe853403"} Apr 21 04:26:41.244077 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:41.244079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pl7sm" event={"ID":"165e36f0-6193-4c39-a29d-a566ebef068d","Type":"ContainerStarted","Data":"b08272fb7682eb9969cb6c2a30908cf25d6b17214d6d95b620cc5b722328a710"} Apr 21 04:26:41.263337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:41.263291 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pl7sm" podStartSLOduration=2.471752766 podStartE2EDuration="3.263278153s" podCreationTimestamp="2026-04-21 04:26:38 +0000 UTC" firstStartedPulling="2026-04-21 04:26:39.219643642 +0000 UTC m=+178.079747602" lastFinishedPulling="2026-04-21 04:26:40.011169029 +0000 UTC m=+178.871272989" observedRunningTime="2026-04-21 04:26:41.262763424 +0000 UTC m=+180.122867399" watchObservedRunningTime="2026-04-21 04:26:41.263278153 +0000 UTC m=+180.123382132" Apr 21 04:26:43.280459 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:43.280423 2574 patch_prober.go:28] interesting pod/image-registry-55fbc49767-s2v7s container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 04:26:43.280838 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:43.280478 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Apr 21 04:26:45.194680 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:45.194652 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:26:46.091011 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:46.090973 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55fbc49767-s2v7s"] Apr 21 04:26:58.202021 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:26:58.201979 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" podUID="ca792d14-29bc-4254-9900-24c8dd60bb22" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 04:27:08.201889 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:08.201845 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" podUID="ca792d14-29bc-4254-9900-24c8dd60bb22" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 04:27:11.109993 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.109922 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b" containerName="registry" containerID="cri-o://f75ccb028e63525148cc05f8078ca5a5c30eb0d3327f9b6e17e20ba0b35e6766" gracePeriod=30 Apr 21 04:27:11.325574 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.325542 2574 generic.go:358] "Generic (PLEG): container finished" podID="3910765c-b3b6-4194-b81f-e0ede65ac33b" containerID="f75ccb028e63525148cc05f8078ca5a5c30eb0d3327f9b6e17e20ba0b35e6766" exitCode=0 Apr 21 04:27:11.325745 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.325616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" event={"ID":"3910765c-b3b6-4194-b81f-e0ede65ac33b","Type":"ContainerDied","Data":"f75ccb028e63525148cc05f8078ca5a5c30eb0d3327f9b6e17e20ba0b35e6766"} Apr 21 04:27:11.338565 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.338543 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rbw56_65dc137e-873c-4d24-b876-d35819405cac/serve-healthcheck-canary/0.log" Apr 21 04:27:11.342106 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.342087 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:27:11.396221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396141 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-bound-sa-token\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396197 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-installation-pull-secrets\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396440 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396227 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3910765c-b3b6-4194-b81f-e0ede65ac33b-ca-trust-extracted\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396440 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396289 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-certificates\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396440 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396312 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-trusted-ca\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396440 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396330 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rxln\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-kube-api-access-9rxln\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396440 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396380 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-image-registry-private-configuration\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396440 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396411 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") pod \"3910765c-b3b6-4194-b81f-e0ede65ac33b\" (UID: \"3910765c-b3b6-4194-b81f-e0ede65ac33b\") " Apr 21 04:27:11.396981 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.396900 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:11.397120 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.397007 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:27:11.398892 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.398866 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:27:11.399015 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.398988 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:11.399118 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.399091 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-kube-api-access-9rxln" (OuterVolumeSpecName: "kube-api-access-9rxln") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "kube-api-access-9rxln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:27:11.399174 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.399109 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:27:11.399174 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.399146 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:27:11.405323 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.405295 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3910765c-b3b6-4194-b81f-e0ede65ac33b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3910765c-b3b6-4194-b81f-e0ede65ac33b" (UID: "3910765c-b3b6-4194-b81f-e0ede65ac33b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:27:11.497851 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497816 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-tls\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.497851 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497845 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-bound-sa-token\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.497851 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497854 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-installation-pull-secrets\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.498080 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497864 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3910765c-b3b6-4194-b81f-e0ede65ac33b-ca-trust-extracted\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.498080 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497874 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-registry-certificates\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.498080 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497882 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3910765c-b3b6-4194-b81f-e0ede65ac33b-trusted-ca\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.498080 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:27:11.497891 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9rxln\" (UniqueName: \"kubernetes.io/projected/3910765c-b3b6-4194-b81f-e0ede65ac33b-kube-api-access-9rxln\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:11.498080 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:11.497901 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3910765c-b3b6-4194-b81f-e0ede65ac33b-image-registry-private-configuration\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:27:12.329080 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:12.329048 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" Apr 21 04:27:12.329536 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:12.329045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55fbc49767-s2v7s" event={"ID":"3910765c-b3b6-4194-b81f-e0ede65ac33b","Type":"ContainerDied","Data":"7a0fc78f134ebbce8701cd0eca9899f914daf2e6545322025eeadfdf14fa1d20"} Apr 21 04:27:12.329536 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:12.329174 2574 scope.go:117] "RemoveContainer" containerID="f75ccb028e63525148cc05f8078ca5a5c30eb0d3327f9b6e17e20ba0b35e6766" Apr 21 04:27:12.344767 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:12.344744 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55fbc49767-s2v7s"] Apr 21 04:27:12.348042 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:12.348024 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55fbc49767-s2v7s"] Apr 21 04:27:13.686293 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:13.686263 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b" path="/var/lib/kubelet/pods/3910765c-b3b6-4194-b81f-e0ede65ac33b/volumes" Apr 21 04:27:18.201770 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:18.201708 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" podUID="ca792d14-29bc-4254-9900-24c8dd60bb22" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 04:27:18.202155 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:18.201800 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" Apr 21 04:27:18.202268 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:18.202240 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e3e44f77269698bd3ed8fac2c8a4dde84770dc9596d2a47c418695c2e387d4d3"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 04:27:18.202315 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:18.202288 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" podUID="ca792d14-29bc-4254-9900-24c8dd60bb22" containerName="service-proxy" containerID="cri-o://e3e44f77269698bd3ed8fac2c8a4dde84770dc9596d2a47c418695c2e387d4d3" gracePeriod=30 Apr 21 04:27:18.347337 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:18.347309 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca792d14-29bc-4254-9900-24c8dd60bb22" containerID="e3e44f77269698bd3ed8fac2c8a4dde84770dc9596d2a47c418695c2e387d4d3" exitCode=2 Apr 21 04:27:18.347428 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:18.347377 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" event={"ID":"ca792d14-29bc-4254-9900-24c8dd60bb22","Type":"ContainerDied","Data":"e3e44f77269698bd3ed8fac2c8a4dde84770dc9596d2a47c418695c2e387d4d3"} Apr 21 04:27:19.351524 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:19.351491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6549466998-j4mmq" event={"ID":"ca792d14-29bc-4254-9900-24c8dd60bb22","Type":"ContainerStarted","Data":"552317e5ccc9bba92bfe4e4db94e3c759fff45d3c8808787ee04301734bb7704"} Apr 21 04:27:53.528290 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:53.528235 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:27:53.530498 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:53.530477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c904be64-c3a8-4c8b-97c6-75f15dbf9670-metrics-certs\") pod \"network-metrics-daemon-bwjm7\" (UID: \"c904be64-c3a8-4c8b-97c6-75f15dbf9670\") " pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:27:53.587482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:53.587455 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9c9js\"" Apr 21 04:27:53.596141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:53.596121 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bwjm7" Apr 21 04:27:53.712605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:53.712574 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bwjm7"] Apr 21 04:27:53.716656 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:27:53.716626 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc904be64_c3a8_4c8b_97c6_75f15dbf9670.slice/crio-a2c0b9bc84df5c617567d2bf1791343f09d7427a4a15a3f7a8d9f791463f5f53 WatchSource:0}: Error finding container a2c0b9bc84df5c617567d2bf1791343f09d7427a4a15a3f7a8d9f791463f5f53: Status 404 returned error can't find the container with id a2c0b9bc84df5c617567d2bf1791343f09d7427a4a15a3f7a8d9f791463f5f53 Apr 21 04:27:54.446872 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:54.446835 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bwjm7" event={"ID":"c904be64-c3a8-4c8b-97c6-75f15dbf9670","Type":"ContainerStarted","Data":"a2c0b9bc84df5c617567d2bf1791343f09d7427a4a15a3f7a8d9f791463f5f53"} Apr 21 04:27:55.451936 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:55.451900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bwjm7" event={"ID":"c904be64-c3a8-4c8b-97c6-75f15dbf9670","Type":"ContainerStarted","Data":"212b42d78f8a60de9648b30b9edcef648fa7d673c33e33369cd65043d7aed440"} Apr 21 04:27:55.452311 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:55.451943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bwjm7" event={"ID":"c904be64-c3a8-4c8b-97c6-75f15dbf9670","Type":"ContainerStarted","Data":"66b0f5ffc34e6844d716d28009fb9127a8c9a983f6d754783d23d8731f730ac2"} Apr 21 04:27:55.466930 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:27:55.466880 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-bwjm7" podStartSLOduration=252.526170097 podStartE2EDuration="4m13.466861276s" podCreationTimestamp="2026-04-21 04:23:42 +0000 UTC" firstStartedPulling="2026-04-21 04:27:53.71835402 +0000 UTC m=+252.578457977" lastFinishedPulling="2026-04-21 04:27:54.659045197 +0000 UTC m=+253.519149156" observedRunningTime="2026-04-21 04:27:55.465499687 +0000 UTC m=+254.325603665" watchObservedRunningTime="2026-04-21 04:27:55.466861276 +0000 UTC m=+254.326965257" Apr 21 04:28:41.605778 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:28:41.605752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log" Apr 21 04:28:41.606333 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:28:41.605757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log" Apr 21 04:28:41.609317 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:28:41.609295 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:29:58.418605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.416823 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9"] Apr 21 04:29:58.418605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.417346 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b" containerName="registry" Apr 21 04:29:58.418605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.417364 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b" containerName="registry" Apr 21 04:29:58.418605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.417485 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3910765c-b3b6-4194-b81f-e0ede65ac33b" 
containerName="registry" Apr 21 04:29:58.421220 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.421202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.423802 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.423773 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:29:58.423939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.423820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\"" Apr 21 04:29:58.423939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.423831 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:29:58.426955 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.426933 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9"] Apr 21 04:29:58.583871 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.583834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.584069 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.583893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: 
\"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.584069 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.583956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvw7\" (UniqueName: \"kubernetes.io/projected/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-kube-api-access-fcvw7\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.684809 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.684710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.684809 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.684758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvw7\" (UniqueName: \"kubernetes.io/projected/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-kube-api-access-fcvw7\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.685016 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.684875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: 
\"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.685167 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.685146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.685225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.685167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.692560 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.692536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvw7\" (UniqueName: \"kubernetes.io/projected/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-kube-api-access-fcvw7\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.731506 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.731479 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:29:58.842325 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.842291 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9"] Apr 21 04:29:58.845526 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:29:58.845498 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c7fb800_7f1a_4493_85cf_d1c628b0ac9f.slice/crio-83c55e1e44c1c59d2048c73139e62e73c8ce27b2c43acb375bfb43d9797b13cb WatchSource:0}: Error finding container 83c55e1e44c1c59d2048c73139e62e73c8ce27b2c43acb375bfb43d9797b13cb: Status 404 returned error can't find the container with id 83c55e1e44c1c59d2048c73139e62e73c8ce27b2c43acb375bfb43d9797b13cb Apr 21 04:29:58.847357 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:58.847343 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:29:59.768216 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:29:59.768183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" event={"ID":"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f","Type":"ContainerStarted","Data":"83c55e1e44c1c59d2048c73139e62e73c8ce27b2c43acb375bfb43d9797b13cb"} Apr 21 04:30:04.788277 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:04.788235 2574 generic.go:358] "Generic (PLEG): container finished" podID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerID="810748222d89f8fb00ce1aef4077fb196a6d9c7f417009871b5db1fc12e31f15" exitCode=0 Apr 21 04:30:04.788679 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:04.788302 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" 
event={"ID":"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f","Type":"ContainerDied","Data":"810748222d89f8fb00ce1aef4077fb196a6d9c7f417009871b5db1fc12e31f15"} Apr 21 04:30:10.807307 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:10.807269 2574 generic.go:358] "Generic (PLEG): container finished" podID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerID="519db36e2ac3bbab07ede8aa6a5bff8678cc5af60d0f953b5c61e61cfc28ccd8" exitCode=0 Apr 21 04:30:10.807695 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:10.807324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" event={"ID":"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f","Type":"ContainerDied","Data":"519db36e2ac3bbab07ede8aa6a5bff8678cc5af60d0f953b5c61e61cfc28ccd8"} Apr 21 04:30:16.824829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:16.824794 2574 generic.go:358] "Generic (PLEG): container finished" podID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerID="e657e3c66fb682abf5db03e9ccb12c7c91dcd7ad0f719e24f763365003e46c07" exitCode=0 Apr 21 04:30:16.825200 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:16.824873 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" event={"ID":"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f","Type":"ContainerDied","Data":"e657e3c66fb682abf5db03e9ccb12c7c91dcd7ad0f719e24f763365003e46c07"} Apr 21 04:30:17.945321 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:17.945297 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:30:18.012596 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.012563 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-util\") pod \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " Apr 21 04:30:18.012764 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.012612 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-bundle\") pod \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " Apr 21 04:30:18.012764 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.012647 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcvw7\" (UniqueName: \"kubernetes.io/projected/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-kube-api-access-fcvw7\") pod \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\" (UID: \"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f\") " Apr 21 04:30:18.013356 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.013328 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-bundle" (OuterVolumeSpecName: "bundle") pod "3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" (UID: "3c7fb800-7f1a-4493-85cf-d1c628b0ac9f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:18.014835 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.014804 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-kube-api-access-fcvw7" (OuterVolumeSpecName: "kube-api-access-fcvw7") pod "3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" (UID: "3c7fb800-7f1a-4493-85cf-d1c628b0ac9f"). InnerVolumeSpecName "kube-api-access-fcvw7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:30:18.017116 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.017098 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-util" (OuterVolumeSpecName: "util") pod "3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" (UID: "3c7fb800-7f1a-4493-85cf-d1c628b0ac9f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:18.113977 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.113888 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:18.113977 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.113922 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcvw7\" (UniqueName: \"kubernetes.io/projected/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-kube-api-access-fcvw7\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:18.113977 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.113932 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c7fb800-7f1a-4493-85cf-d1c628b0ac9f-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:18.832629 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.832599 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" event={"ID":"3c7fb800-7f1a-4493-85cf-d1c628b0ac9f","Type":"ContainerDied","Data":"83c55e1e44c1c59d2048c73139e62e73c8ce27b2c43acb375bfb43d9797b13cb"} Apr 21 04:30:18.832629 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.832629 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c55e1e44c1c59d2048c73139e62e73c8ce27b2c43acb375bfb43d9797b13cb" Apr 21 04:30:18.832852 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:18.832645 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dz6bc9" Apr 21 04:30:31.031711 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031669 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5"] Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031937 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="pull" Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031949 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="pull" Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031961 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="util" Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031966 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="util" Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031975 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="extract" Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.031980 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="extract" Apr 21 04:30:31.032183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.032021 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c7fb800-7f1a-4493-85cf-d1c628b0ac9f" containerName="extract" Apr 21 04:30:31.034810 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.034793 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.037002 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.036981 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:30:31.037863 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.037850 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:30:31.037923 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.037869 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\"" Apr 21 04:30:31.041801 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.041781 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5"] Apr 21 04:30:31.102084 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.102044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: 
\"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.102084 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.102088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhgps\" (UniqueName: \"kubernetes.io/projected/b703ea84-4e82-47f8-85b0-8789135ebc85-kube-api-access-rhgps\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.102298 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.102109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.203405 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.203367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.203405 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.203411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhgps\" (UniqueName: \"kubernetes.io/projected/b703ea84-4e82-47f8-85b0-8789135ebc85-kube-api-access-rhgps\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: 
\"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.203653 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.203432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.203801 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.203784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.203889 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.203850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.210972 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.210953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhgps\" (UniqueName: \"kubernetes.io/projected/b703ea84-4e82-47f8-85b0-8789135ebc85-kube-api-access-rhgps\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.343775 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.343715 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:31.457821 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.457767 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5"] Apr 21 04:30:31.462566 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:30:31.462535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb703ea84_4e82_47f8_85b0_8789135ebc85.slice/crio-da4f2439a23ed3885ff2cfbbfe25538460c97c57444a8fe3ad54e290e390b426 WatchSource:0}: Error finding container da4f2439a23ed3885ff2cfbbfe25538460c97c57444a8fe3ad54e290e390b426: Status 404 returned error can't find the container with id da4f2439a23ed3885ff2cfbbfe25538460c97c57444a8fe3ad54e290e390b426 Apr 21 04:30:31.866448 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.866415 2574 generic.go:358] "Generic (PLEG): container finished" podID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerID="db469403b0afe42996ebb584a77c2eb87c37600b294f66b25bdba3ab0dd4d2be" exitCode=0 Apr 21 04:30:31.866614 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.866499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" event={"ID":"b703ea84-4e82-47f8-85b0-8789135ebc85","Type":"ContainerDied","Data":"db469403b0afe42996ebb584a77c2eb87c37600b294f66b25bdba3ab0dd4d2be"} Apr 21 04:30:31.866614 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:31.866538 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" event={"ID":"b703ea84-4e82-47f8-85b0-8789135ebc85","Type":"ContainerStarted","Data":"da4f2439a23ed3885ff2cfbbfe25538460c97c57444a8fe3ad54e290e390b426"} Apr 21 04:30:35.878676 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:35.878636 2574 generic.go:358] "Generic (PLEG): container finished" podID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerID="ffc08b6deedc9a75d1a618991a6c697fe5859d7dd03ad8725c9658628f4ac8c4" exitCode=0 Apr 21 04:30:35.879113 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:35.878696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" event={"ID":"b703ea84-4e82-47f8-85b0-8789135ebc85","Type":"ContainerDied","Data":"ffc08b6deedc9a75d1a618991a6c697fe5859d7dd03ad8725c9658628f4ac8c4"} Apr 21 04:30:36.883243 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:36.883206 2574 generic.go:358] "Generic (PLEG): container finished" podID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerID="04f3ccbf3e8bc5e32957e3341f4a0fa2378598c03d8d1f27801e67dfba1fbedc" exitCode=0 Apr 21 04:30:36.883620 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:36.883285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" event={"ID":"b703ea84-4e82-47f8-85b0-8789135ebc85","Type":"ContainerDied","Data":"04f3ccbf3e8bc5e32957e3341f4a0fa2378598c03d8d1f27801e67dfba1fbedc"} Apr 21 04:30:37.996863 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:37.996838 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:38.054648 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.054605 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-util\") pod \"b703ea84-4e82-47f8-85b0-8789135ebc85\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " Apr 21 04:30:38.054648 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.054654 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-bundle\") pod \"b703ea84-4e82-47f8-85b0-8789135ebc85\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " Apr 21 04:30:38.054911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.054701 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhgps\" (UniqueName: \"kubernetes.io/projected/b703ea84-4e82-47f8-85b0-8789135ebc85-kube-api-access-rhgps\") pod \"b703ea84-4e82-47f8-85b0-8789135ebc85\" (UID: \"b703ea84-4e82-47f8-85b0-8789135ebc85\") " Apr 21 04:30:38.055060 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.055036 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-bundle" (OuterVolumeSpecName: "bundle") pod "b703ea84-4e82-47f8-85b0-8789135ebc85" (UID: "b703ea84-4e82-47f8-85b0-8789135ebc85"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:38.056743 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.056706 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b703ea84-4e82-47f8-85b0-8789135ebc85-kube-api-access-rhgps" (OuterVolumeSpecName: "kube-api-access-rhgps") pod "b703ea84-4e82-47f8-85b0-8789135ebc85" (UID: "b703ea84-4e82-47f8-85b0-8789135ebc85"). InnerVolumeSpecName "kube-api-access-rhgps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:30:38.059710 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.059658 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-util" (OuterVolumeSpecName: "util") pod "b703ea84-4e82-47f8-85b0-8789135ebc85" (UID: "b703ea84-4e82-47f8-85b0-8789135ebc85"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:38.156025 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.155928 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:38.156025 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.155970 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b703ea84-4e82-47f8-85b0-8789135ebc85-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:38.156025 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.155987 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhgps\" (UniqueName: \"kubernetes.io/projected/b703ea84-4e82-47f8-85b0-8789135ebc85-kube-api-access-rhgps\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:38.890455 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.890419 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" event={"ID":"b703ea84-4e82-47f8-85b0-8789135ebc85","Type":"ContainerDied","Data":"da4f2439a23ed3885ff2cfbbfe25538460c97c57444a8fe3ad54e290e390b426"} Apr 21 04:30:38.890455 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.890452 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4f2439a23ed3885ff2cfbbfe25538460c97c57444a8fe3ad54e290e390b426" Apr 21 04:30:38.890702 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:38.890478 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fx4cw5" Apr 21 04:30:44.862513 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862476 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2fkfz"] Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862703 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="util" Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862713 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="util" Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862740 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="pull" Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862746 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="pull" Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862753 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="extract" Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862758 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="extract" Apr 21 04:30:44.862911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.862800 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b703ea84-4e82-47f8-85b0-8789135ebc85" containerName="extract" Apr 21 04:30:44.865176 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.865160 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:44.867277 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.867253 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 04:30:44.868110 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.868093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-gdtkf\"" Apr 21 04:30:44.868195 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.868097 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 04:30:44.872871 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.872851 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2fkfz"] Apr 21 04:30:44.902991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:44.902963 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhksz\" (UniqueName: \"kubernetes.io/projected/5f59d346-27bc-4f14-b54d-4d54886f30c5-kube-api-access-mhksz\") pod \"cert-manager-79c8d999ff-2fkfz\" (UID: \"5f59d346-27bc-4f14-b54d-4d54886f30c5\") " pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:44.903105 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:30:44.903008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f59d346-27bc-4f14-b54d-4d54886f30c5-bound-sa-token\") pod \"cert-manager-79c8d999ff-2fkfz\" (UID: \"5f59d346-27bc-4f14-b54d-4d54886f30c5\") " pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:45.004304 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.004269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f59d346-27bc-4f14-b54d-4d54886f30c5-bound-sa-token\") pod \"cert-manager-79c8d999ff-2fkfz\" (UID: \"5f59d346-27bc-4f14-b54d-4d54886f30c5\") " pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:45.004474 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.004347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhksz\" (UniqueName: \"kubernetes.io/projected/5f59d346-27bc-4f14-b54d-4d54886f30c5-kube-api-access-mhksz\") pod \"cert-manager-79c8d999ff-2fkfz\" (UID: \"5f59d346-27bc-4f14-b54d-4d54886f30c5\") " pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:45.017782 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.017749 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f59d346-27bc-4f14-b54d-4d54886f30c5-bound-sa-token\") pod \"cert-manager-79c8d999ff-2fkfz\" (UID: \"5f59d346-27bc-4f14-b54d-4d54886f30c5\") " pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:45.020245 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.020225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhksz\" (UniqueName: \"kubernetes.io/projected/5f59d346-27bc-4f14-b54d-4d54886f30c5-kube-api-access-mhksz\") pod \"cert-manager-79c8d999ff-2fkfz\" (UID: 
\"5f59d346-27bc-4f14-b54d-4d54886f30c5\") " pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:45.174444 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.174354 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2fkfz" Apr 21 04:30:45.291358 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.291331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2fkfz"] Apr 21 04:30:45.293882 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:30:45.293846 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f59d346_27bc_4f14_b54d_4d54886f30c5.slice/crio-7e510d7663c313336de8dc673b167a47b3e7795105b75e0720fb4eed7cd6c77c WatchSource:0}: Error finding container 7e510d7663c313336de8dc673b167a47b3e7795105b75e0720fb4eed7cd6c77c: Status 404 returned error can't find the container with id 7e510d7663c313336de8dc673b167a47b3e7795105b75e0720fb4eed7cd6c77c Apr 21 04:30:45.909715 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:45.909681 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2fkfz" event={"ID":"5f59d346-27bc-4f14-b54d-4d54886f30c5","Type":"ContainerStarted","Data":"7e510d7663c313336de8dc673b167a47b3e7795105b75e0720fb4eed7cd6c77c"} Apr 21 04:30:48.919453 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:48.919407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2fkfz" event={"ID":"5f59d346-27bc-4f14-b54d-4d54886f30c5","Type":"ContainerStarted","Data":"0cdaec75e1777bd8f6ef730ecabef8f49bd5dc3986082b0ab1b7f996f30843a2"} Apr 21 04:30:48.937900 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:48.937830 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-2fkfz" podStartSLOduration=1.902763253 podStartE2EDuration="4.937812723s" 
podCreationTimestamp="2026-04-21 04:30:44 +0000 UTC" firstStartedPulling="2026-04-21 04:30:45.295533807 +0000 UTC m=+424.155637768" lastFinishedPulling="2026-04-21 04:30:48.33058327 +0000 UTC m=+427.190687238" observedRunningTime="2026-04-21 04:30:48.936626801 +0000 UTC m=+427.796730780" watchObservedRunningTime="2026-04-21 04:30:48.937812723 +0000 UTC m=+427.797916702" Apr 21 04:30:50.841456 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.841415 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx"] Apr 21 04:30:50.861939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.861912 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx"] Apr 21 04:30:50.862091 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.862031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:50.864585 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.864557 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:30:50.864885 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.864608 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:30:50.865290 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.865271 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\"" Apr 21 04:30:50.947935 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.947904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7jv\" (UniqueName: 
\"kubernetes.io/projected/90b0850c-5f32-49d1-9b56-83664a21dbec-kube-api-access-rb7jv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:50.948111 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.947941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:50.948111 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:50.947973 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.049037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.049001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7jv\" (UniqueName: \"kubernetes.io/projected/90b0850c-5f32-49d1-9b56-83664a21dbec-kube-api-access-rb7jv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.049037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.049040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.049290 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.049065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.049463 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.049439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.049537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.049453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.056803 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.056776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7jv\" (UniqueName: \"kubernetes.io/projected/90b0850c-5f32-49d1-9b56-83664a21dbec-kube-api-access-rb7jv\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.170652 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.170574 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:51.286851 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.286817 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx"] Apr 21 04:30:51.290881 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:30:51.290849 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b0850c_5f32_49d1_9b56_83664a21dbec.slice/crio-35b9767749f651c8771bc0268c17aabfcc044416103cbee49bf0dbcbc8b9bd0f WatchSource:0}: Error finding container 35b9767749f651c8771bc0268c17aabfcc044416103cbee49bf0dbcbc8b9bd0f: Status 404 returned error can't find the container with id 35b9767749f651c8771bc0268c17aabfcc044416103cbee49bf0dbcbc8b9bd0f Apr 21 04:30:51.931579 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.931541 2574 generic.go:358] "Generic (PLEG): container finished" podID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerID="97e0436fcb68c7cc800f547cb22d97e99a81cc8ea228b03f11e481fd2648b312" exitCode=0 Apr 21 04:30:51.932049 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.931622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" event={"ID":"90b0850c-5f32-49d1-9b56-83664a21dbec","Type":"ContainerDied","Data":"97e0436fcb68c7cc800f547cb22d97e99a81cc8ea228b03f11e481fd2648b312"} Apr 21 04:30:51.932049 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:51.931656 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" event={"ID":"90b0850c-5f32-49d1-9b56-83664a21dbec","Type":"ContainerStarted","Data":"35b9767749f651c8771bc0268c17aabfcc044416103cbee49bf0dbcbc8b9bd0f"} Apr 21 04:30:52.937172 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:52.937137 2574 generic.go:358] "Generic (PLEG): container finished" podID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerID="b6f367a660ce68a6c759805ab51b62bc9932b8da36e2928c344ec12d03232c80" exitCode=0 Apr 21 04:30:52.937545 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:52.937178 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" event={"ID":"90b0850c-5f32-49d1-9b56-83664a21dbec","Type":"ContainerDied","Data":"b6f367a660ce68a6c759805ab51b62bc9932b8da36e2928c344ec12d03232c80"} Apr 21 04:30:53.942171 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:53.942137 2574 generic.go:358] "Generic (PLEG): container finished" podID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerID="e1fabbdd1cca50bbfe221b6c26ecfa329dd4d386db4b53f6302ef9d623840970" exitCode=0 Apr 21 04:30:53.942527 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:53.942190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" event={"ID":"90b0850c-5f32-49d1-9b56-83664a21dbec","Type":"ContainerDied","Data":"e1fabbdd1cca50bbfe221b6c26ecfa329dd4d386db4b53f6302ef9d623840970"} Apr 21 04:30:55.064065 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.064043 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:30:55.175455 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.175420 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7jv\" (UniqueName: \"kubernetes.io/projected/90b0850c-5f32-49d1-9b56-83664a21dbec-kube-api-access-rb7jv\") pod \"90b0850c-5f32-49d1-9b56-83664a21dbec\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " Apr 21 04:30:55.175641 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.175491 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-bundle\") pod \"90b0850c-5f32-49d1-9b56-83664a21dbec\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " Apr 21 04:30:55.175641 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.175511 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-util\") pod \"90b0850c-5f32-49d1-9b56-83664a21dbec\" (UID: \"90b0850c-5f32-49d1-9b56-83664a21dbec\") " Apr 21 04:30:55.176200 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.176171 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-bundle" (OuterVolumeSpecName: "bundle") pod "90b0850c-5f32-49d1-9b56-83664a21dbec" (UID: "90b0850c-5f32-49d1-9b56-83664a21dbec"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:55.177571 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.177547 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b0850c-5f32-49d1-9b56-83664a21dbec-kube-api-access-rb7jv" (OuterVolumeSpecName: "kube-api-access-rb7jv") pod "90b0850c-5f32-49d1-9b56-83664a21dbec" (UID: "90b0850c-5f32-49d1-9b56-83664a21dbec"). InnerVolumeSpecName "kube-api-access-rb7jv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:30:55.180837 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.180813 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-util" (OuterVolumeSpecName: "util") pod "90b0850c-5f32-49d1-9b56-83664a21dbec" (UID: "90b0850c-5f32-49d1-9b56-83664a21dbec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:30:55.276174 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.276073 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:55.276174 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.276115 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90b0850c-5f32-49d1-9b56-83664a21dbec-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:55.276174 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.276128 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rb7jv\" (UniqueName: \"kubernetes.io/projected/90b0850c-5f32-49d1-9b56-83664a21dbec-kube-api-access-rb7jv\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:30:55.950038 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.949954 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" event={"ID":"90b0850c-5f32-49d1-9b56-83664a21dbec","Type":"ContainerDied","Data":"35b9767749f651c8771bc0268c17aabfcc044416103cbee49bf0dbcbc8b9bd0f"} Apr 21 04:30:55.950038 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.949985 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35b9767749f651c8771bc0268c17aabfcc044416103cbee49bf0dbcbc8b9bd0f" Apr 21 04:30:55.950038 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:30:55.950004 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h68wx" Apr 21 04:31:07.812011 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.811971 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk"] Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812193 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="pull" Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812203 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="pull" Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812212 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="extract" Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812218 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="extract" Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812232 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="util" Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812238 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="util" Apr 21 04:31:07.812523 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.812278 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90b0850c-5f32-49d1-9b56-83664a21dbec" containerName="extract" Apr 21 04:31:07.819297 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.819277 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.821948 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.821923 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zqtks\"" Apr 21 04:31:07.822056 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.822040 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 04:31:07.822223 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.822197 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 04:31:07.822314 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.822289 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 04:31:07.823460 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.823442 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 04:31:07.830090 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.830068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk"] Apr 21 04:31:07.880105 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.880066 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bac42821-e633-4da9-bb86-c0f282c8a751-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.880288 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.880120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pm47\" (UniqueName: \"kubernetes.io/projected/bac42821-e633-4da9-bb86-c0f282c8a751-kube-api-access-2pm47\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.880288 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.880195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bac42821-e633-4da9-bb86-c0f282c8a751-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.980600 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.980570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bac42821-e633-4da9-bb86-c0f282c8a751-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " 
pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.980600 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.980605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bac42821-e633-4da9-bb86-c0f282c8a751-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.980860 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.980627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pm47\" (UniqueName: \"kubernetes.io/projected/bac42821-e633-4da9-bb86-c0f282c8a751-kube-api-access-2pm47\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.982958 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.982934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bac42821-e633-4da9-bb86-c0f282c8a751-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.983061 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:07.982976 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bac42821-e633-4da9-bb86-c0f282c8a751-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:07.990685 ip-10-0-140-163 kubenswrapper[2574]: I0421 
04:31:07.990661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pm47\" (UniqueName: \"kubernetes.io/projected/bac42821-e633-4da9-bb86-c0f282c8a751-kube-api-access-2pm47\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-wsldk\" (UID: \"bac42821-e633-4da9-bb86-c0f282c8a751\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:08.132152 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.132105 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:08.254695 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.254664 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk"] Apr 21 04:31:08.261537 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:31:08.261502 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac42821_e633_4da9_bb86_c0f282c8a751.slice/crio-258424b5c085f5609f0e5e9c1bdfc1243d22defee77fc0c4d6799d9617526d8c WatchSource:0}: Error finding container 258424b5c085f5609f0e5e9c1bdfc1243d22defee77fc0c4d6799d9617526d8c: Status 404 returned error can't find the container with id 258424b5c085f5609f0e5e9c1bdfc1243d22defee77fc0c4d6799d9617526d8c Apr 21 04:31:08.263269 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.263243 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh"] Apr 21 04:31:08.268666 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.268649 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.271392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.271195 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\"" Apr 21 04:31:08.271392 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.271293 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:31:08.271569 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.271450 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:31:08.273032 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.273001 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh"] Apr 21 04:31:08.383299 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.383206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.383299 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.383273 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.383483 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.383337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtg7\" (UniqueName: \"kubernetes.io/projected/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-kube-api-access-dvtg7\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.484618 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.484583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.484806 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.484698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.484806 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.484750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtg7\" (UniqueName: \"kubernetes.io/projected/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-kube-api-access-dvtg7\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.484886 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:31:08.484837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.484999 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.484981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.492322 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.492296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtg7\" (UniqueName: \"kubernetes.io/projected/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-kube-api-access-dvtg7\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.579235 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.579201 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:08.711081 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.711058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh"] Apr 21 04:31:08.713331 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:31:08.713309 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4228dd_a8f5_4c69_8cf8_e8a6cd1b780c.slice/crio-5e42cff6ff57ca1945bbd4669350994179ff89e6dcdf6daebd24c2d19584d665 WatchSource:0}: Error finding container 5e42cff6ff57ca1945bbd4669350994179ff89e6dcdf6daebd24c2d19584d665: Status 404 returned error can't find the container with id 5e42cff6ff57ca1945bbd4669350994179ff89e6dcdf6daebd24c2d19584d665 Apr 21 04:31:08.987139 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.987058 2574 generic.go:358] "Generic (PLEG): container finished" podID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerID="68ed1f7c82f0ee2fdba3b1f6a1ab2b7c63b8349c909c93d81ca23ea0ab409af4" exitCode=0 Apr 21 04:31:08.987139 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.987139 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" event={"ID":"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c","Type":"ContainerDied","Data":"68ed1f7c82f0ee2fdba3b1f6a1ab2b7c63b8349c909c93d81ca23ea0ab409af4"} Apr 21 04:31:08.987638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:08.987170 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" event={"ID":"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c","Type":"ContainerStarted","Data":"5e42cff6ff57ca1945bbd4669350994179ff89e6dcdf6daebd24c2d19584d665"} Apr 21 04:31:08.988869 ip-10-0-140-163 kubenswrapper[2574]: 
I0421 04:31:08.988833 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" event={"ID":"bac42821-e633-4da9-bb86-c0f282c8a751","Type":"ContainerStarted","Data":"258424b5c085f5609f0e5e9c1bdfc1243d22defee77fc0c4d6799d9617526d8c"} Apr 21 04:31:10.997393 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:10.997354 2574 generic.go:358] "Generic (PLEG): container finished" podID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerID="721d5a85b90dffc382620ff41e03c6f23ee27e77ad2b92900d1f9e9117940798" exitCode=0 Apr 21 04:31:10.997882 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:10.997423 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" event={"ID":"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c","Type":"ContainerDied","Data":"721d5a85b90dffc382620ff41e03c6f23ee27e77ad2b92900d1f9e9117940798"} Apr 21 04:31:10.998970 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:10.998942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" event={"ID":"bac42821-e633-4da9-bb86-c0f282c8a751","Type":"ContainerStarted","Data":"9d488e55b35c8bb5cd90524603c8a5bbe8dfac9bdf869cbf97ce5c884c997e1e"} Apr 21 04:31:10.999086 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:10.999055 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:11.039742 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:11.037442 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" podStartSLOduration=1.495124077 podStartE2EDuration="4.037424835s" podCreationTimestamp="2026-04-21 04:31:07 +0000 UTC" firstStartedPulling="2026-04-21 04:31:08.263151923 +0000 UTC m=+447.123255879" 
lastFinishedPulling="2026-04-21 04:31:10.80545267 +0000 UTC m=+449.665556637" observedRunningTime="2026-04-21 04:31:11.035346194 +0000 UTC m=+449.895450174" watchObservedRunningTime="2026-04-21 04:31:11.037424835 +0000 UTC m=+449.897528814" Apr 21 04:31:12.003224 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:12.003184 2574 generic.go:358] "Generic (PLEG): container finished" podID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerID="2cc613576b6deb49379c101ed4d3ec620a86ad9b6d6592c0f7d1a2c32d8ddccb" exitCode=0 Apr 21 04:31:12.003651 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:12.003267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" event={"ID":"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c","Type":"ContainerDied","Data":"2cc613576b6deb49379c101ed4d3ec620a86ad9b6d6592c0f7d1a2c32d8ddccb"} Apr 21 04:31:13.125423 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.125398 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:13.218153 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.218113 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvtg7\" (UniqueName: \"kubernetes.io/projected/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-kube-api-access-dvtg7\") pod \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " Apr 21 04:31:13.218319 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.218205 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-util\") pod \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " Apr 21 04:31:13.218319 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.218224 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-bundle\") pod \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\" (UID: \"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c\") " Apr 21 04:31:13.219153 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.219126 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-bundle" (OuterVolumeSpecName: "bundle") pod "df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" (UID: "df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:31:13.220408 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.220374 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-kube-api-access-dvtg7" (OuterVolumeSpecName: "kube-api-access-dvtg7") pod "df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" (UID: "df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c"). InnerVolumeSpecName "kube-api-access-dvtg7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:31:13.223421 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.223396 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-util" (OuterVolumeSpecName: "util") pod "df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" (UID: "df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:31:13.318674 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.318577 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:31:13.318674 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.318617 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:31:13.318674 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.318626 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvtg7\" (UniqueName: \"kubernetes.io/projected/df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c-kube-api-access-dvtg7\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:31:13.667994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.667962 2574 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh"] Apr 21 04:31:13.668208 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668197 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="util" Apr 21 04:31:13.668254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668209 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="util" Apr 21 04:31:13.668254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668218 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="pull" Apr 21 04:31:13.668254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668224 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="pull" Apr 21 04:31:13.668254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668229 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="extract" Apr 21 04:31:13.668254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668235 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="extract" Apr 21 04:31:13.668402 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.668281 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c" containerName="extract" Apr 21 04:31:13.672021 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.672005 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.674263 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.674239 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 04:31:13.674380 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.674282 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 04:31:13.675150 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.675129 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 04:31:13.675272 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.675166 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z9g9q\"" Apr 21 04:31:13.675272 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.675166 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:31:13.675407 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.675273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 04:31:13.677788 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.677770 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh"] Apr 21 04:31:13.822028 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.821996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8142b997-841e-4633-ac0f-3a0a600c0bca-manager-config\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " 
pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.822028 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.822028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8142b997-841e-4633-ac0f-3a0a600c0bca-metrics-cert\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.822225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.822063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtsd\" (UniqueName: \"kubernetes.io/projected/8142b997-841e-4633-ac0f-3a0a600c0bca-kube-api-access-vrtsd\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.822225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.822114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8142b997-841e-4633-ac0f-3a0a600c0bca-cert\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.923074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.922997 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8142b997-841e-4633-ac0f-3a0a600c0bca-manager-config\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.923074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.923030 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8142b997-841e-4633-ac0f-3a0a600c0bca-metrics-cert\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.923074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.923054 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtsd\" (UniqueName: \"kubernetes.io/projected/8142b997-841e-4633-ac0f-3a0a600c0bca-kube-api-access-vrtsd\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.923325 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.923078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8142b997-841e-4633-ac0f-3a0a600c0bca-cert\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.923695 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.923661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8142b997-841e-4633-ac0f-3a0a600c0bca-manager-config\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.925544 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.925526 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8142b997-841e-4633-ac0f-3a0a600c0bca-metrics-cert\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " 
pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.925656 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.925637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8142b997-841e-4633-ac0f-3a0a600c0bca-cert\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.932211 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.932190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtsd\" (UniqueName: \"kubernetes.io/projected/8142b997-841e-4633-ac0f-3a0a600c0bca-kube-api-access-vrtsd\") pod \"lws-controller-manager-579f6d4cb9-jbdjh\" (UID: \"8142b997-841e-4633-ac0f-3a0a600c0bca\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:13.982194 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:13.982164 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:14.013188 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:14.013153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" event={"ID":"df4228dd-a8f5-4c69-8cf8-e8a6cd1b780c","Type":"ContainerDied","Data":"5e42cff6ff57ca1945bbd4669350994179ff89e6dcdf6daebd24c2d19584d665"} Apr 21 04:31:14.013188 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:14.013186 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e42cff6ff57ca1945bbd4669350994179ff89e6dcdf6daebd24c2d19584d665" Apr 21 04:31:14.013345 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:14.013209 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xgwwh" Apr 21 04:31:14.097933 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:14.097908 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh"] Apr 21 04:31:14.100270 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:31:14.100243 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8142b997_841e_4633_ac0f_3a0a600c0bca.slice/crio-d87f9994b0d42f3a6387094d8ed0f7c03932b315e38d2cc74f91f8ae5e2b5799 WatchSource:0}: Error finding container d87f9994b0d42f3a6387094d8ed0f7c03932b315e38d2cc74f91f8ae5e2b5799: Status 404 returned error can't find the container with id d87f9994b0d42f3a6387094d8ed0f7c03932b315e38d2cc74f91f8ae5e2b5799 Apr 21 04:31:15.018247 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:15.018207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" event={"ID":"8142b997-841e-4633-ac0f-3a0a600c0bca","Type":"ContainerStarted","Data":"d87f9994b0d42f3a6387094d8ed0f7c03932b315e38d2cc74f91f8ae5e2b5799"} Apr 21 04:31:16.022393 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:16.022302 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" event={"ID":"8142b997-841e-4633-ac0f-3a0a600c0bca","Type":"ContainerStarted","Data":"9ed7f6cf615b00d652bdc6e8acd98886358da1e4cf4c5071691dbe1d2b5aef52"} Apr 21 04:31:16.022393 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:16.022359 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:16.038569 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:16.038518 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" podStartSLOduration=1.5457422969999999 podStartE2EDuration="3.038502464s" podCreationTimestamp="2026-04-21 04:31:13 +0000 UTC" firstStartedPulling="2026-04-21 04:31:14.102471121 +0000 UTC m=+452.962575082" lastFinishedPulling="2026-04-21 04:31:15.595231284 +0000 UTC m=+454.455335249" observedRunningTime="2026-04-21 04:31:16.03722958 +0000 UTC m=+454.897333560" watchObservedRunningTime="2026-04-21 04:31:16.038502464 +0000 UTC m=+454.898606444" Apr 21 04:31:22.005851 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:22.005820 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-wsldk" Apr 21 04:31:26.840627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.840590 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf"] Apr 21 04:31:26.845233 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.845216 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:26.847530 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.847506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\"" Apr 21 04:31:26.847659 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.847506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:31:26.847659 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.847514 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:31:26.850739 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.850703 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf"] Apr 21 04:31:26.922738 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.922673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:26.922901 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.922779 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsqx7\" (UniqueName: \"kubernetes.io/projected/77619bca-8f0f-46c9-b499-2d423a06aab7-kube-api-access-hsqx7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:26.922901 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:26.922806 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.024122 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.024085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsqx7\" (UniqueName: \"kubernetes.io/projected/77619bca-8f0f-46c9-b499-2d423a06aab7-kube-api-access-hsqx7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.024295 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.024129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.024295 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.024248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.024467 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.024453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.024566 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.024549 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.027765 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.027742 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-jbdjh" Apr 21 04:31:27.031677 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.031650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsqx7\" (UniqueName: \"kubernetes.io/projected/77619bca-8f0f-46c9-b499-2d423a06aab7-kube-api-access-hsqx7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.155495 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.155383 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" Apr 21 04:31:27.272326 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:27.272220 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf"] Apr 21 04:31:27.274970 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:31:27.274943 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77619bca_8f0f_46c9_b499_2d423a06aab7.slice/crio-f0b18055b01d3543701f21791004dc731c8dc348b9bc56ca361c16f4d2b91b17 WatchSource:0}: Error finding container f0b18055b01d3543701f21791004dc731c8dc348b9bc56ca361c16f4d2b91b17: Status 404 returned error can't find the container with id f0b18055b01d3543701f21791004dc731c8dc348b9bc56ca361c16f4d2b91b17 Apr 21 04:31:28.059909 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:28.059876 2574 generic.go:358] "Generic (PLEG): container finished" podID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerID="97c3a50857406bec1f048f3c172182bc274580571679db67921ddf0c95cf132e" exitCode=0 Apr 21 04:31:28.060240 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:28.059925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" event={"ID":"77619bca-8f0f-46c9-b499-2d423a06aab7","Type":"ContainerDied","Data":"97c3a50857406bec1f048f3c172182bc274580571679db67921ddf0c95cf132e"} Apr 21 04:31:28.060240 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:28.059946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" event={"ID":"77619bca-8f0f-46c9-b499-2d423a06aab7","Type":"ContainerStarted","Data":"f0b18055b01d3543701f21791004dc731c8dc348b9bc56ca361c16f4d2b91b17"} Apr 21 04:31:29.064774 ip-10-0-140-163 kubenswrapper[2574]: 
I0421 04:31:29.064739 2574 generic.go:358] "Generic (PLEG): container finished" podID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerID="591662a175558227590925ea6f344297debc7353a1c29a0857d3bea3210a37a4" exitCode=0 Apr 21 04:31:29.064774 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:29.064754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" event={"ID":"77619bca-8f0f-46c9-b499-2d423a06aab7","Type":"ContainerDied","Data":"591662a175558227590925ea6f344297debc7353a1c29a0857d3bea3210a37a4"} Apr 21 04:31:30.070043 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:30.070006 2574 generic.go:358] "Generic (PLEG): container finished" podID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerID="e31acb23ec5d9f940f5cdda6755eec47197a609086c6f0daf26161376c44e3c4" exitCode=0 Apr 21 04:31:30.070424 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:30.070077 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" event={"ID":"77619bca-8f0f-46c9-b499-2d423a06aab7","Type":"ContainerDied","Data":"e31acb23ec5d9f940f5cdda6755eec47197a609086c6f0daf26161376c44e3c4"} Apr 21 04:31:31.195519 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.195482 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf"
Apr 21 04:31:31.358283 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.358250 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-util\") pod \"77619bca-8f0f-46c9-b499-2d423a06aab7\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") "
Apr 21 04:31:31.358437 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.358319 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-bundle\") pod \"77619bca-8f0f-46c9-b499-2d423a06aab7\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") "
Apr 21 04:31:31.358437 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.358360 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsqx7\" (UniqueName: \"kubernetes.io/projected/77619bca-8f0f-46c9-b499-2d423a06aab7-kube-api-access-hsqx7\") pod \"77619bca-8f0f-46c9-b499-2d423a06aab7\" (UID: \"77619bca-8f0f-46c9-b499-2d423a06aab7\") "
Apr 21 04:31:31.359175 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.359148 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-bundle" (OuterVolumeSpecName: "bundle") pod "77619bca-8f0f-46c9-b499-2d423a06aab7" (UID: "77619bca-8f0f-46c9-b499-2d423a06aab7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:31:31.360432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.360404 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77619bca-8f0f-46c9-b499-2d423a06aab7-kube-api-access-hsqx7" (OuterVolumeSpecName: "kube-api-access-hsqx7") pod "77619bca-8f0f-46c9-b499-2d423a06aab7" (UID: "77619bca-8f0f-46c9-b499-2d423a06aab7"). InnerVolumeSpecName "kube-api-access-hsqx7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:31:31.363953 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.363932 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-util" (OuterVolumeSpecName: "util") pod "77619bca-8f0f-46c9-b499-2d423a06aab7" (UID: "77619bca-8f0f-46c9-b499-2d423a06aab7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:31:31.459474 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.459444 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:31:31.459474 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.459467 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsqx7\" (UniqueName: \"kubernetes.io/projected/77619bca-8f0f-46c9-b499-2d423a06aab7-kube-api-access-hsqx7\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:31:31.459474 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:31.459477 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77619bca-8f0f-46c9-b499-2d423a06aab7-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:31:32.077532 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:32.077500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf" event={"ID":"77619bca-8f0f-46c9-b499-2d423a06aab7","Type":"ContainerDied","Data":"f0b18055b01d3543701f21791004dc731c8dc348b9bc56ca361c16f4d2b91b17"}
Apr 21 04:31:32.077679 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:32.077533 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hm4gf"
Apr 21 04:31:32.077679 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:32.077541 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b18055b01d3543701f21791004dc731c8dc348b9bc56ca361c16f4d2b91b17"
Apr 21 04:31:36.783332 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783299 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"]
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783542 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="extract"
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783552 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="extract"
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783560 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="util"
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783566 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="util"
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783576 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="pull"
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783582 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="pull"
Apr 21 04:31:36.783747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.783634 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="77619bca-8f0f-46c9-b499-2d423a06aab7" containerName="extract"
Apr 21 04:31:36.787713 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.787697 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:36.790534 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.790508 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 04:31:36.791074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.791050 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\""
Apr 21 04:31:36.792343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.792328 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 04:31:36.800394 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.800374 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"]
Apr 21 04:31:36.902585 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.902548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-484x2\" (UniqueName: \"kubernetes.io/projected/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-kube-api-access-484x2\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:36.902768 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.902596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:36.902768 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:36.902697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.003697 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.003666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-484x2\" (UniqueName: \"kubernetes.io/projected/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-kube-api-access-484x2\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.003817 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.003712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.003817 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.003786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.004127 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.004104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.004127 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.004120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.018585 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.018548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-484x2\" (UniqueName: \"kubernetes.io/projected/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-kube-api-access-484x2\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.096619 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.096590 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:37.248172 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:37.248137 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"]
Apr 21 04:31:37.251292 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:31:37.251265 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703beb12_f770_4dfe_8ac3_a1a55e66f1c3.slice/crio-919ba0a470754f55f73e398a44f76a2231454a5eb62ea2f803b5b47330e36692 WatchSource:0}: Error finding container 919ba0a470754f55f73e398a44f76a2231454a5eb62ea2f803b5b47330e36692: Status 404 returned error can't find the container with id 919ba0a470754f55f73e398a44f76a2231454a5eb62ea2f803b5b47330e36692
Apr 21 04:31:38.097127 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:38.097095 2574 generic.go:358] "Generic (PLEG): container finished" podID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerID="2c13e1eb7ba681c31cc1eb01f24b8ade17c38415af5e02e34c65813004558e7d" exitCode=0
Apr 21 04:31:38.097490 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:38.097170 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c" event={"ID":"703beb12-f770-4dfe-8ac3-a1a55e66f1c3","Type":"ContainerDied","Data":"2c13e1eb7ba681c31cc1eb01f24b8ade17c38415af5e02e34c65813004558e7d"}
Apr 21 04:31:38.097490 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:38.097192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c" event={"ID":"703beb12-f770-4dfe-8ac3-a1a55e66f1c3","Type":"ContainerStarted","Data":"919ba0a470754f55f73e398a44f76a2231454a5eb62ea2f803b5b47330e36692"}
Apr 21 04:31:39.101784 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:39.101750 2574 generic.go:358] "Generic (PLEG): container finished" podID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerID="293d2c772ba1c35c1069d8d5d25cc4d077b0c99044a31075f21d43630f8b07ce" exitCode=0
Apr 21 04:31:39.102151 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:39.101817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c" event={"ID":"703beb12-f770-4dfe-8ac3-a1a55e66f1c3","Type":"ContainerDied","Data":"293d2c772ba1c35c1069d8d5d25cc4d077b0c99044a31075f21d43630f8b07ce"}
Apr 21 04:31:40.107597 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:40.107564 2574 generic.go:358] "Generic (PLEG): container finished" podID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerID="3fb752fcfb2b96e3950ff970b69b4a1a1f24af30fb42c4651ddc6a1960d6b5d7" exitCode=0
Apr 21 04:31:40.108030 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:40.107656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c" event={"ID":"703beb12-f770-4dfe-8ac3-a1a55e66f1c3","Type":"ContainerDied","Data":"3fb752fcfb2b96e3950ff970b69b4a1a1f24af30fb42c4651ddc6a1960d6b5d7"}
Apr 21 04:31:41.227035 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.227011 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:41.336537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.336505 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-484x2\" (UniqueName: \"kubernetes.io/projected/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-kube-api-access-484x2\") pod \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") "
Apr 21 04:31:41.336537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.336548 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-bundle\") pod \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") "
Apr 21 04:31:41.336784 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.336590 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-util\") pod \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\" (UID: \"703beb12-f770-4dfe-8ac3-a1a55e66f1c3\") "
Apr 21 04:31:41.337466 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.337439 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-bundle" (OuterVolumeSpecName: "bundle") pod "703beb12-f770-4dfe-8ac3-a1a55e66f1c3" (UID: "703beb12-f770-4dfe-8ac3-a1a55e66f1c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:31:41.338676 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.338645 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-kube-api-access-484x2" (OuterVolumeSpecName: "kube-api-access-484x2") pod "703beb12-f770-4dfe-8ac3-a1a55e66f1c3" (UID: "703beb12-f770-4dfe-8ac3-a1a55e66f1c3"). InnerVolumeSpecName "kube-api-access-484x2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:31:41.342222 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.342199 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-util" (OuterVolumeSpecName: "util") pod "703beb12-f770-4dfe-8ac3-a1a55e66f1c3" (UID: "703beb12-f770-4dfe-8ac3-a1a55e66f1c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:31:41.437711 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.437634 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:31:41.437711 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.437657 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-484x2\" (UniqueName: \"kubernetes.io/projected/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-kube-api-access-484x2\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:31:41.437711 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:41.437668 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/703beb12-f770-4dfe-8ac3-a1a55e66f1c3-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:31:42.118078 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:42.118056 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c"
Apr 21 04:31:42.118078 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:42.118065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2fx44c" event={"ID":"703beb12-f770-4dfe-8ac3-a1a55e66f1c3","Type":"ContainerDied","Data":"919ba0a470754f55f73e398a44f76a2231454a5eb62ea2f803b5b47330e36692"}
Apr 21 04:31:42.118268 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:42.118094 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="919ba0a470754f55f73e398a44f76a2231454a5eb62ea2f803b5b47330e36692"
Apr 21 04:31:57.158617 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.158581 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"]
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.158962 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="util"
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.158977 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="util"
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.158992 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="pull"
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.159000 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="pull"
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.159020 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="extract"
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.159029 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="extract"
Apr 21 04:31:57.159115 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.159094 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="703beb12-f770-4dfe-8ac3-a1a55e66f1c3" containerName="extract"
Apr 21 04:31:57.169669 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.169649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.171962 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.171933 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 04:31:57.172105 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.172050 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 04:31:57.172386 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.172364 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 04:31:57.172517 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.172369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-kj4nl\""
Apr 21 04:31:57.173176 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.173152 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"]
Apr 21 04:31:57.242558 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242739 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xf2s\" (UniqueName: \"kubernetes.io/projected/cbe8fefc-8e21-492e-8023-06c30c0e6259-kube-api-access-7xf2s\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242739 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbe8fefc-8e21-492e-8023-06c30c0e6259-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242739 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242739 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242739 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242926 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242926 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.242926 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.242846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343297 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xf2s\" (UniqueName: \"kubernetes.io/projected/cbe8fefc-8e21-492e-8023-06c30c0e6259-kube-api-access-7xf2s\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbe8fefc-8e21-492e-8023-06c30c0e6259-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343776 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343776 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343776 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343633 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343776 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343989 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.343989 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.343974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.344092 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.344057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.344169 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.344150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbe8fefc-8e21-492e-8023-06c30c0e6259-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.345752 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.345718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.345908 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.345890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.350464 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.350442 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbe8fefc-8e21-492e-8023-06c30c0e6259-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.351156 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.351136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xf2s\" (UniqueName: \"kubernetes.io/projected/cbe8fefc-8e21-492e-8023-06c30c0e6259-kube-api-access-7xf2s\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6\" (UID: \"cbe8fefc-8e21-492e-8023-06c30c0e6259\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.481416 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.481332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:31:57.611600 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:57.611572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"]
Apr 21 04:31:57.613958 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:31:57.613930 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe8fefc_8e21_492e_8023_06c30c0e6259.slice/crio-8693b14a4aa4baa2060554805576b1be816c6ae3f5cba66034bc400479ee6101 WatchSource:0}: Error finding container 8693b14a4aa4baa2060554805576b1be816c6ae3f5cba66034bc400479ee6101: Status 404 returned error can't find the container with id 8693b14a4aa4baa2060554805576b1be816c6ae3f5cba66034bc400479ee6101
Apr 21 04:31:58.170710 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:31:58.170666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6" event={"ID":"cbe8fefc-8e21-492e-8023-06c30c0e6259","Type":"ContainerStarted","Data":"8693b14a4aa4baa2060554805576b1be816c6ae3f5cba66034bc400479ee6101"}
Apr 21 04:32:02.168573 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:02.168534 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:32:02.168879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:02.168609 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:32:02.168879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:02.168635 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:32:03.187904 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:03.187867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6" event={"ID":"cbe8fefc-8e21-492e-8023-06c30c0e6259","Type":"ContainerStarted","Data":"971ae9bb5188bd2bb6f834239a50dc08bccf7a4383a2e78db9d86b052e931481"}
Apr 21 04:32:03.211308 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:03.211251 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6" podStartSLOduration=1.6607131800000001 podStartE2EDuration="6.211235776s" podCreationTimestamp="2026-04-21 04:31:57 +0000 UTC" firstStartedPulling="2026-04-21 04:31:57.61778211 +0000 UTC m=+496.477886084" lastFinishedPulling="2026-04-21 04:32:02.168304722 +0000 UTC m=+501.028408680" observedRunningTime="2026-04-21 04:32:03.209268173 +0000 UTC m=+502.069372164" watchObservedRunningTime="2026-04-21 04:32:03.211235776 +0000 UTC m=+502.071339755"
Apr 21 04:32:03.481979 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:03.481884 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:32:03.486447 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:03.486423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:32:04.190686 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:04.190657 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6"
Apr 21 04:32:04.191788 ip-10-0-140-163
kubenswrapper[2574]: I0421 04:32:04.191771 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6" Apr 21 04:32:25.984009 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:25.983971 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4h9dt"] Apr 21 04:32:25.988646 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:25.988624 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:25.990847 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:25.990826 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:32:25.990954 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:25.990853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-j8tx5\"" Apr 21 04:32:25.991619 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:25.991603 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:32:25.995525 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:25.995501 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4h9dt"] Apr 21 04:32:26.063359 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.063319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2slf5\" (UniqueName: \"kubernetes.io/projected/1c7c99c5-823b-43d9-ba83-eb0c50c5713c-kube-api-access-2slf5\") pod \"kuadrant-operator-catalog-4h9dt\" (UID: \"1c7c99c5-823b-43d9-ba83-eb0c50c5713c\") " pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:26.164432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.164390 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2slf5\" (UniqueName: \"kubernetes.io/projected/1c7c99c5-823b-43d9-ba83-eb0c50c5713c-kube-api-access-2slf5\") pod \"kuadrant-operator-catalog-4h9dt\" (UID: \"1c7c99c5-823b-43d9-ba83-eb0c50c5713c\") " pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:26.171627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.171607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2slf5\" (UniqueName: \"kubernetes.io/projected/1c7c99c5-823b-43d9-ba83-eb0c50c5713c-kube-api-access-2slf5\") pod \"kuadrant-operator-catalog-4h9dt\" (UID: \"1c7c99c5-823b-43d9-ba83-eb0c50c5713c\") " pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:26.299461 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.299351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:26.342314 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.342278 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4h9dt"] Apr 21 04:32:26.416200 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.416179 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4h9dt"] Apr 21 04:32:26.418244 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:32:26.418215 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c7c99c5_823b_43d9_ba83_eb0c50c5713c.slice/crio-ac1d3e0644a09a3d79aaa9f0d0d5cf29fd6e4af7877781f7042705263897b1aa WatchSource:0}: Error finding container ac1d3e0644a09a3d79aaa9f0d0d5cf29fd6e4af7877781f7042705263897b1aa: Status 404 returned error can't find the container with id ac1d3e0644a09a3d79aaa9f0d0d5cf29fd6e4af7877781f7042705263897b1aa Apr 21 04:32:26.551226 ip-10-0-140-163 kubenswrapper[2574]: I0421 
04:32:26.551135 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wpbq6"] Apr 21 04:32:26.556009 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.555985 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:26.560378 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.560344 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wpbq6"] Apr 21 04:32:26.668705 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.668660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52hc6\" (UniqueName: \"kubernetes.io/projected/48a4cb8f-9ced-461e-a1fa-779f856430a9-kube-api-access-52hc6\") pod \"kuadrant-operator-catalog-wpbq6\" (UID: \"48a4cb8f-9ced-461e-a1fa-779f856430a9\") " pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:26.769551 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.769513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52hc6\" (UniqueName: \"kubernetes.io/projected/48a4cb8f-9ced-461e-a1fa-779f856430a9-kube-api-access-52hc6\") pod \"kuadrant-operator-catalog-wpbq6\" (UID: \"48a4cb8f-9ced-461e-a1fa-779f856430a9\") " pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:26.776556 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.776536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52hc6\" (UniqueName: \"kubernetes.io/projected/48a4cb8f-9ced-461e-a1fa-779f856430a9-kube-api-access-52hc6\") pod \"kuadrant-operator-catalog-wpbq6\" (UID: \"48a4cb8f-9ced-461e-a1fa-779f856430a9\") " pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:26.866644 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.866608 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:26.982635 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:26.982605 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wpbq6"] Apr 21 04:32:26.984328 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:32:26.984188 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a4cb8f_9ced_461e_a1fa_779f856430a9.slice/crio-7611820511a1caa3eb9ef71b3f38c91e7593283ec2fbec0ba409a650280f8a5d WatchSource:0}: Error finding container 7611820511a1caa3eb9ef71b3f38c91e7593283ec2fbec0ba409a650280f8a5d: Status 404 returned error can't find the container with id 7611820511a1caa3eb9ef71b3f38c91e7593283ec2fbec0ba409a650280f8a5d Apr 21 04:32:27.267647 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:27.267553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" event={"ID":"48a4cb8f-9ced-461e-a1fa-779f856430a9","Type":"ContainerStarted","Data":"7611820511a1caa3eb9ef71b3f38c91e7593283ec2fbec0ba409a650280f8a5d"} Apr 21 04:32:27.268752 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:27.268702 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" event={"ID":"1c7c99c5-823b-43d9-ba83-eb0c50c5713c","Type":"ContainerStarted","Data":"ac1d3e0644a09a3d79aaa9f0d0d5cf29fd6e4af7877781f7042705263897b1aa"} Apr 21 04:32:29.277393 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.277355 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" event={"ID":"48a4cb8f-9ced-461e-a1fa-779f856430a9","Type":"ContainerStarted","Data":"96c1e7b9f049a9244a345176169243943135063b3b9c628167f52109ce765dbc"} Apr 21 04:32:29.278638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.278614 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" event={"ID":"1c7c99c5-823b-43d9-ba83-eb0c50c5713c","Type":"ContainerStarted","Data":"7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36"} Apr 21 04:32:29.278761 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.278700 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" podUID="1c7c99c5-823b-43d9-ba83-eb0c50c5713c" containerName="registry-server" containerID="cri-o://7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36" gracePeriod=2 Apr 21 04:32:29.294696 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.294645 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" podStartSLOduration=1.605548867 podStartE2EDuration="3.294629763s" podCreationTimestamp="2026-04-21 04:32:26 +0000 UTC" firstStartedPulling="2026-04-21 04:32:26.985454324 +0000 UTC m=+525.845558281" lastFinishedPulling="2026-04-21 04:32:28.67453522 +0000 UTC m=+527.534639177" observedRunningTime="2026-04-21 04:32:29.290804511 +0000 UTC m=+528.150908490" watchObservedRunningTime="2026-04-21 04:32:29.294629763 +0000 UTC m=+528.154733748" Apr 21 04:32:29.306237 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.306190 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" podStartSLOduration=2.05377613 podStartE2EDuration="4.306174414s" podCreationTimestamp="2026-04-21 04:32:25 +0000 UTC" firstStartedPulling="2026-04-21 04:32:26.419369949 +0000 UTC m=+525.279473906" lastFinishedPulling="2026-04-21 04:32:28.671768234 +0000 UTC m=+527.531872190" observedRunningTime="2026-04-21 04:32:29.305693852 +0000 UTC m=+528.165797833" watchObservedRunningTime="2026-04-21 04:32:29.306174414 +0000 UTC m=+528.166278397" Apr 21 04:32:29.509155 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.509133 2574 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:29.593255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.593228 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2slf5\" (UniqueName: \"kubernetes.io/projected/1c7c99c5-823b-43d9-ba83-eb0c50c5713c-kube-api-access-2slf5\") pod \"1c7c99c5-823b-43d9-ba83-eb0c50c5713c\" (UID: \"1c7c99c5-823b-43d9-ba83-eb0c50c5713c\") " Apr 21 04:32:29.595386 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.595357 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7c99c5-823b-43d9-ba83-eb0c50c5713c-kube-api-access-2slf5" (OuterVolumeSpecName: "kube-api-access-2slf5") pod "1c7c99c5-823b-43d9-ba83-eb0c50c5713c" (UID: "1c7c99c5-823b-43d9-ba83-eb0c50c5713c"). InnerVolumeSpecName "kube-api-access-2slf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:32:29.693913 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:29.693886 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2slf5\" (UniqueName: \"kubernetes.io/projected/1c7c99c5-823b-43d9-ba83-eb0c50c5713c-kube-api-access-2slf5\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:30.282514 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.282477 2574 generic.go:358] "Generic (PLEG): container finished" podID="1c7c99c5-823b-43d9-ba83-eb0c50c5713c" containerID="7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36" exitCode=0 Apr 21 04:32:30.283013 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.282544 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" Apr 21 04:32:30.283013 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.282568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" event={"ID":"1c7c99c5-823b-43d9-ba83-eb0c50c5713c","Type":"ContainerDied","Data":"7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36"} Apr 21 04:32:30.283013 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.282602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-4h9dt" event={"ID":"1c7c99c5-823b-43d9-ba83-eb0c50c5713c","Type":"ContainerDied","Data":"ac1d3e0644a09a3d79aaa9f0d0d5cf29fd6e4af7877781f7042705263897b1aa"} Apr 21 04:32:30.283013 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.282615 2574 scope.go:117] "RemoveContainer" containerID="7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36" Apr 21 04:32:30.293429 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.293410 2574 scope.go:117] "RemoveContainer" containerID="7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36" Apr 21 04:32:30.293681 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:32:30.293659 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36\": container with ID starting with 7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36 not found: ID does not exist" containerID="7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36" Apr 21 04:32:30.293717 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.293690 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36"} err="failed to get container status \"7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36\": rpc 
error: code = NotFound desc = could not find container \"7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36\": container with ID starting with 7e4397f66e0b6500b772f7efab052504a48dbdb0ad87fad312a80ea6f474ac36 not found: ID does not exist" Apr 21 04:32:30.297802 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.297776 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4h9dt"] Apr 21 04:32:30.302994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:30.302970 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4h9dt"] Apr 21 04:32:31.688081 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:31.688051 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c7c99c5-823b-43d9-ba83-eb0c50c5713c" path="/var/lib/kubelet/pods/1c7c99c5-823b-43d9-ba83-eb0c50c5713c/volumes" Apr 21 04:32:36.867253 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:36.867210 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:36.867253 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:36.867263 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:36.889320 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:36.889285 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:37.326361 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:37.326295 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-wpbq6" Apr 21 04:32:41.632919 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.632894 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf"] Apr 21 04:32:41.633245 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.633174 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c7c99c5-823b-43d9-ba83-eb0c50c5713c" containerName="registry-server" Apr 21 04:32:41.633245 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.633184 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7c99c5-823b-43d9-ba83-eb0c50c5713c" containerName="registry-server" Apr 21 04:32:41.633245 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.633227 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c7c99c5-823b-43d9-ba83-eb0c50c5713c" containerName="registry-server" Apr 21 04:32:41.641597 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.641577 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.644228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.644205 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cbpgz\"" Apr 21 04:32:41.654779 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.654753 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf"] Apr 21 04:32:41.793116 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.793074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.793322 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.793132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.793322 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.793197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76brt\" (UniqueName: \"kubernetes.io/projected/be30dc22-b0e8-468c-92aa-a37020853240-kube-api-access-76brt\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.894661 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.894572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.894661 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.894609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76brt\" (UniqueName: \"kubernetes.io/projected/be30dc22-b0e8-468c-92aa-a37020853240-kube-api-access-76brt\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.894661 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.894660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.895042 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.895027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.895083 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.895023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.905399 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.905371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76brt\" (UniqueName: \"kubernetes.io/projected/be30dc22-b0e8-468c-92aa-a37020853240-kube-api-access-76brt\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:41.950931 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:41.950902 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:42.072712 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.072663 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf"] Apr 21 04:32:42.074497 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:32:42.074451 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe30dc22_b0e8_468c_92aa_a37020853240.slice/crio-0fd858c362a283c9d3b3fccc37b2d7c098f09d1ed6c61950981b34792af0ff6d WatchSource:0}: Error finding container 0fd858c362a283c9d3b3fccc37b2d7c098f09d1ed6c61950981b34792af0ff6d: Status 404 returned error can't find the container with id 0fd858c362a283c9d3b3fccc37b2d7c098f09d1ed6c61950981b34792af0ff6d Apr 21 04:32:42.193049 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.193021 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs"] Apr 21 04:32:42.196345 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.196325 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.203033 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.203009 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs"] Apr 21 04:32:42.297380 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.297347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.297568 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.297386 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm4kp\" (UniqueName: \"kubernetes.io/projected/73b4592f-9384-4b6f-a122-ad4317c5ffb0-kube-api-access-xm4kp\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.297568 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.297474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.322095 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.322058 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="be30dc22-b0e8-468c-92aa-a37020853240" containerID="226e339aa6939c844aec6238bc7621ab8da59bfd44de28f06341fa053e593806" exitCode=0 Apr 21 04:32:42.322279 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.322127 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" event={"ID":"be30dc22-b0e8-468c-92aa-a37020853240","Type":"ContainerDied","Data":"226e339aa6939c844aec6238bc7621ab8da59bfd44de28f06341fa053e593806"} Apr 21 04:32:42.322279 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.322149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" event={"ID":"be30dc22-b0e8-468c-92aa-a37020853240","Type":"ContainerStarted","Data":"0fd858c362a283c9d3b3fccc37b2d7c098f09d1ed6c61950981b34792af0ff6d"} Apr 21 04:32:42.398589 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.398550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.398820 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.398612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.398820 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.398750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xm4kp\" (UniqueName: \"kubernetes.io/projected/73b4592f-9384-4b6f-a122-ad4317c5ffb0-kube-api-access-xm4kp\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.398978 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.398960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.399056 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.399034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.407517 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.407493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm4kp\" (UniqueName: \"kubernetes.io/projected/73b4592f-9384-4b6f-a122-ad4317c5ffb0-kube-api-access-xm4kp\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.556914 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.556809 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:42.674911 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.674886 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs"] Apr 21 04:32:42.676630 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:32:42.676604 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b4592f_9384_4b6f_a122_ad4317c5ffb0.slice/crio-dbe87aef07a7e06317933809045252eda252558e4c70accf826d58f6f160a5c2 WatchSource:0}: Error finding container dbe87aef07a7e06317933809045252eda252558e4c70accf826d58f6f160a5c2: Status 404 returned error can't find the container with id dbe87aef07a7e06317933809045252eda252558e4c70accf826d58f6f160a5c2 Apr 21 04:32:42.866864 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.866832 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml"] Apr 21 04:32:42.870134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.870118 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:42.879470 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:42.879446 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml"] Apr 21 04:32:43.002760 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.002704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.002918 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.002881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlthz\" (UniqueName: \"kubernetes.io/projected/28f6b4ba-638e-46c3-9357-9840f2c1faee-kube-api-access-nlthz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.002968 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.002940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.103431 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.103396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nlthz\" (UniqueName: \"kubernetes.io/projected/28f6b4ba-638e-46c3-9357-9840f2c1faee-kube-api-access-nlthz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.103565 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.103534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.103622 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.103570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.103925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.103909 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.103997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.103924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.111353 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.111333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlthz\" (UniqueName: \"kubernetes.io/projected/28f6b4ba-638e-46c3-9357-9840f2c1faee-kube-api-access-nlthz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.178524 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.178498 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:43.190383 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.190355 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4"] Apr 21 04:32:43.193761 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.193744 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.200469 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.200445 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4"] Apr 21 04:32:43.299690 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.299659 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml"] Apr 21 04:32:43.301108 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:32:43.301083 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f6b4ba_638e_46c3_9357_9840f2c1faee.slice/crio-b8b7c1fccf69f352cfb0352163a98b4ab829a22f18f4d10da6f8564c82c6b27e WatchSource:0}: Error finding container b8b7c1fccf69f352cfb0352163a98b4ab829a22f18f4d10da6f8564c82c6b27e: Status 404 returned error can't find the container with id b8b7c1fccf69f352cfb0352163a98b4ab829a22f18f4d10da6f8564c82c6b27e Apr 21 04:32:43.305087 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.305066 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xwxr\" (UniqueName: \"kubernetes.io/projected/a6ec1d15-a3d0-4783-811b-3f22bdfba652-kube-api-access-7xwxr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.305155 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.305104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: 
\"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.305221 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.305164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.326861 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.326832 2574 generic.go:358] "Generic (PLEG): container finished" podID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerID="4ceff60f22d9a038294cc7e3ad5608db4ef90c0e36563bb659bc91a435da146d" exitCode=0 Apr 21 04:32:43.326989 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.326914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" event={"ID":"73b4592f-9384-4b6f-a122-ad4317c5ffb0","Type":"ContainerDied","Data":"4ceff60f22d9a038294cc7e3ad5608db4ef90c0e36563bb659bc91a435da146d"} Apr 21 04:32:43.326989 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.326950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" event={"ID":"73b4592f-9384-4b6f-a122-ad4317c5ffb0","Type":"ContainerStarted","Data":"dbe87aef07a7e06317933809045252eda252558e4c70accf826d58f6f160a5c2"} Apr 21 04:32:43.328002 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.327973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" 
event={"ID":"28f6b4ba-638e-46c3-9357-9840f2c1faee","Type":"ContainerStarted","Data":"b8b7c1fccf69f352cfb0352163a98b4ab829a22f18f4d10da6f8564c82c6b27e"} Apr 21 04:32:43.329572 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.329551 2574 generic.go:358] "Generic (PLEG): container finished" podID="be30dc22-b0e8-468c-92aa-a37020853240" containerID="c7d25fb5b6b53a303eeb482d08120a27821c0315e708831ec8fede057e87b720" exitCode=0 Apr 21 04:32:43.329638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.329606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" event={"ID":"be30dc22-b0e8-468c-92aa-a37020853240","Type":"ContainerDied","Data":"c7d25fb5b6b53a303eeb482d08120a27821c0315e708831ec8fede057e87b720"} Apr 21 04:32:43.406037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.406003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.406228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.406085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.406228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.406141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xwxr\" (UniqueName: 
\"kubernetes.io/projected/a6ec1d15-a3d0-4783-811b-3f22bdfba652-kube-api-access-7xwxr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.406505 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.406476 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.406593 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.406493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.415344 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.415318 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xwxr\" (UniqueName: \"kubernetes.io/projected/a6ec1d15-a3d0-4783-811b-3f22bdfba652-kube-api-access-7xwxr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.532892 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.532857 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:43.646260 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:43.646232 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4"] Apr 21 04:32:43.648498 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:32:43.648468 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ec1d15_a3d0_4783_811b_3f22bdfba652.slice/crio-efa3134b07c1e27565bf500b2e7fce406a527ab8ea8ae6b5636b82868cff5534 WatchSource:0}: Error finding container efa3134b07c1e27565bf500b2e7fce406a527ab8ea8ae6b5636b82868cff5534: Status 404 returned error can't find the container with id efa3134b07c1e27565bf500b2e7fce406a527ab8ea8ae6b5636b82868cff5534 Apr 21 04:32:44.334669 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.334639 2574 generic.go:358] "Generic (PLEG): container finished" podID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerID="6f2d6380b4b994057f1a7eef948ee2c7e64f29b1eb943e497a4b07b0f75a051d" exitCode=0 Apr 21 04:32:44.335091 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.334715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" event={"ID":"73b4592f-9384-4b6f-a122-ad4317c5ffb0","Type":"ContainerDied","Data":"6f2d6380b4b994057f1a7eef948ee2c7e64f29b1eb943e497a4b07b0f75a051d"} Apr 21 04:32:44.336101 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.336079 2574 generic.go:358] "Generic (PLEG): container finished" podID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerID="b822a8202acc7370c0068820695d88905c243f91ec0206f9ce453fa935f065e9" exitCode=0 Apr 21 04:32:44.336216 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.336166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" event={"ID":"28f6b4ba-638e-46c3-9357-9840f2c1faee","Type":"ContainerDied","Data":"b822a8202acc7370c0068820695d88905c243f91ec0206f9ce453fa935f065e9"} Apr 21 04:32:44.337701 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.337676 2574 generic.go:358] "Generic (PLEG): container finished" podID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerID="3aabe4f492b44c0d98fb07d6b525e5fb8659394b4dfc88cbb1ae9ce458e0eece" exitCode=0 Apr 21 04:32:44.337799 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.337716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" event={"ID":"a6ec1d15-a3d0-4783-811b-3f22bdfba652","Type":"ContainerDied","Data":"3aabe4f492b44c0d98fb07d6b525e5fb8659394b4dfc88cbb1ae9ce458e0eece"} Apr 21 04:32:44.337799 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.337757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" event={"ID":"a6ec1d15-a3d0-4783-811b-3f22bdfba652","Type":"ContainerStarted","Data":"efa3134b07c1e27565bf500b2e7fce406a527ab8ea8ae6b5636b82868cff5534"} Apr 21 04:32:44.339870 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.339814 2574 generic.go:358] "Generic (PLEG): container finished" podID="be30dc22-b0e8-468c-92aa-a37020853240" containerID="18c91cd06978f74edbb64bbd2ead46563d7f9d969f4c1ecec587f2bb567d1c82" exitCode=0 Apr 21 04:32:44.339939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:44.339887 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" event={"ID":"be30dc22-b0e8-468c-92aa-a37020853240","Type":"ContainerDied","Data":"18c91cd06978f74edbb64bbd2ead46563d7f9d969f4c1ecec587f2bb567d1c82"} Apr 21 04:32:45.350413 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.350379 2574 
generic.go:358] "Generic (PLEG): container finished" podID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerID="1c644fc4c1d4f946eba155fc03943dc6365d394509833551df650ba431894831" exitCode=0 Apr 21 04:32:45.350862 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.350462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" event={"ID":"73b4592f-9384-4b6f-a122-ad4317c5ffb0","Type":"ContainerDied","Data":"1c644fc4c1d4f946eba155fc03943dc6365d394509833551df650ba431894831"} Apr 21 04:32:45.352210 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.352181 2574 generic.go:358] "Generic (PLEG): container finished" podID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerID="00a75af06eca88599e8153c7e3f5676cd851839e63f8a403d865f0038c2e0fb4" exitCode=0 Apr 21 04:32:45.352329 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.352273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" event={"ID":"28f6b4ba-638e-46c3-9357-9840f2c1faee","Type":"ContainerDied","Data":"00a75af06eca88599e8153c7e3f5676cd851839e63f8a403d865f0038c2e0fb4"} Apr 21 04:32:45.354053 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.354000 2574 generic.go:358] "Generic (PLEG): container finished" podID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerID="68c3b474f99e1ab9c10a3fa06b432ee2bde6c35739ddfcaaf276469ee769c360" exitCode=0 Apr 21 04:32:45.354125 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.354083 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" event={"ID":"a6ec1d15-a3d0-4783-811b-3f22bdfba652","Type":"ContainerDied","Data":"68c3b474f99e1ab9c10a3fa06b432ee2bde6c35739ddfcaaf276469ee769c360"} Apr 21 04:32:45.478129 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.478107 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:45.622204 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.622173 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-bundle\") pod \"be30dc22-b0e8-468c-92aa-a37020853240\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " Apr 21 04:32:45.622346 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.622262 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-util\") pod \"be30dc22-b0e8-468c-92aa-a37020853240\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " Apr 21 04:32:45.622346 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.622285 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76brt\" (UniqueName: \"kubernetes.io/projected/be30dc22-b0e8-468c-92aa-a37020853240-kube-api-access-76brt\") pod \"be30dc22-b0e8-468c-92aa-a37020853240\" (UID: \"be30dc22-b0e8-468c-92aa-a37020853240\") " Apr 21 04:32:45.622675 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.622648 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-bundle" (OuterVolumeSpecName: "bundle") pod "be30dc22-b0e8-468c-92aa-a37020853240" (UID: "be30dc22-b0e8-468c-92aa-a37020853240"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:45.624353 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.624331 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be30dc22-b0e8-468c-92aa-a37020853240-kube-api-access-76brt" (OuterVolumeSpecName: "kube-api-access-76brt") pod "be30dc22-b0e8-468c-92aa-a37020853240" (UID: "be30dc22-b0e8-468c-92aa-a37020853240"). InnerVolumeSpecName "kube-api-access-76brt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:32:45.627348 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.627326 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-util" (OuterVolumeSpecName: "util") pod "be30dc22-b0e8-468c-92aa-a37020853240" (UID: "be30dc22-b0e8-468c-92aa-a37020853240"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:45.723661 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.723626 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:45.723661 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.723655 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76brt\" (UniqueName: \"kubernetes.io/projected/be30dc22-b0e8-468c-92aa-a37020853240-kube-api-access-76brt\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:45.723861 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:45.723677 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be30dc22-b0e8-468c-92aa-a37020853240-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:46.359211 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.359179 2574 generic.go:358] "Generic 
(PLEG): container finished" podID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerID="a7df7e137c607dc204df887db4f4e2e05841db252c90b5e257d909f2b7a7c46a" exitCode=0 Apr 21 04:32:46.359619 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.359247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" event={"ID":"a6ec1d15-a3d0-4783-811b-3f22bdfba652","Type":"ContainerDied","Data":"a7df7e137c607dc204df887db4f4e2e05841db252c90b5e257d909f2b7a7c46a"} Apr 21 04:32:46.360936 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.360918 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" Apr 21 04:32:46.361057 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.360930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf" event={"ID":"be30dc22-b0e8-468c-92aa-a37020853240","Type":"ContainerDied","Data":"0fd858c362a283c9d3b3fccc37b2d7c098f09d1ed6c61950981b34792af0ff6d"} Apr 21 04:32:46.361057 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.360961 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fd858c362a283c9d3b3fccc37b2d7c098f09d1ed6c61950981b34792af0ff6d" Apr 21 04:32:46.363088 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.362966 2574 generic.go:358] "Generic (PLEG): container finished" podID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerID="d2bbdc5ab15a3f6be07b8fcf631e41cf8f56d28d15d0b4f049537d94cb408dbb" exitCode=0 Apr 21 04:32:46.363088 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.363014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" 
event={"ID":"28f6b4ba-638e-46c3-9357-9840f2c1faee","Type":"ContainerDied","Data":"d2bbdc5ab15a3f6be07b8fcf631e41cf8f56d28d15d0b4f049537d94cb408dbb"} Apr 21 04:32:46.488594 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.488568 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:46.631423 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.631388 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-util\") pod \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " Apr 21 04:32:46.631632 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.631456 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-bundle\") pod \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " Apr 21 04:32:46.631632 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.631484 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm4kp\" (UniqueName: \"kubernetes.io/projected/73b4592f-9384-4b6f-a122-ad4317c5ffb0-kube-api-access-xm4kp\") pod \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\" (UID: \"73b4592f-9384-4b6f-a122-ad4317c5ffb0\") " Apr 21 04:32:46.632101 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.632077 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-bundle" (OuterVolumeSpecName: "bundle") pod "73b4592f-9384-4b6f-a122-ad4317c5ffb0" (UID: "73b4592f-9384-4b6f-a122-ad4317c5ffb0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:46.633531 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.633509 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b4592f-9384-4b6f-a122-ad4317c5ffb0-kube-api-access-xm4kp" (OuterVolumeSpecName: "kube-api-access-xm4kp") pod "73b4592f-9384-4b6f-a122-ad4317c5ffb0" (UID: "73b4592f-9384-4b6f-a122-ad4317c5ffb0"). InnerVolumeSpecName "kube-api-access-xm4kp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:32:46.636198 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.636161 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-util" (OuterVolumeSpecName: "util") pod "73b4592f-9384-4b6f-a122-ad4317c5ffb0" (UID: "73b4592f-9384-4b6f-a122-ad4317c5ffb0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:46.731994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.731918 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:46.731994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.731941 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73b4592f-9384-4b6f-a122-ad4317c5ffb0-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:46.731994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:46.731951 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xm4kp\" (UniqueName: \"kubernetes.io/projected/73b4592f-9384-4b6f-a122-ad4317c5ffb0-kube-api-access-xm4kp\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:47.368506 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.368479 2574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" Apr 21 04:32:47.368506 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.368494 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs" event={"ID":"73b4592f-9384-4b6f-a122-ad4317c5ffb0","Type":"ContainerDied","Data":"dbe87aef07a7e06317933809045252eda252558e4c70accf826d58f6f160a5c2"} Apr 21 04:32:47.369058 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.368531 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe87aef07a7e06317933809045252eda252558e4c70accf826d58f6f160a5c2" Apr 21 04:32:47.499073 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.499054 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:32:47.524828 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.524807 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:47.639571 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.639488 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlthz\" (UniqueName: \"kubernetes.io/projected/28f6b4ba-638e-46c3-9357-9840f2c1faee-kube-api-access-nlthz\") pod \"28f6b4ba-638e-46c3-9357-9840f2c1faee\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " Apr 21 04:32:47.639571 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.639530 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-util\") pod \"28f6b4ba-638e-46c3-9357-9840f2c1faee\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " Apr 21 04:32:47.639571 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.639576 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-util\") pod \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " Apr 21 04:32:47.639918 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.639599 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-bundle\") pod \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " Apr 21 04:32:47.639918 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.639621 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xwxr\" (UniqueName: \"kubernetes.io/projected/a6ec1d15-a3d0-4783-811b-3f22bdfba652-kube-api-access-7xwxr\") pod \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\" (UID: \"a6ec1d15-a3d0-4783-811b-3f22bdfba652\") " Apr 21 04:32:47.639918 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:32:47.639644 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-bundle\") pod \"28f6b4ba-638e-46c3-9357-9840f2c1faee\" (UID: \"28f6b4ba-638e-46c3-9357-9840f2c1faee\") " Apr 21 04:32:47.640194 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.640168 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-bundle" (OuterVolumeSpecName: "bundle") pod "a6ec1d15-a3d0-4783-811b-3f22bdfba652" (UID: "a6ec1d15-a3d0-4783-811b-3f22bdfba652"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:47.640323 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.640292 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-bundle" (OuterVolumeSpecName: "bundle") pod "28f6b4ba-638e-46c3-9357-9840f2c1faee" (UID: "28f6b4ba-638e-46c3-9357-9840f2c1faee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:47.641794 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.641764 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f6b4ba-638e-46c3-9357-9840f2c1faee-kube-api-access-nlthz" (OuterVolumeSpecName: "kube-api-access-nlthz") pod "28f6b4ba-638e-46c3-9357-9840f2c1faee" (UID: "28f6b4ba-638e-46c3-9357-9840f2c1faee"). InnerVolumeSpecName "kube-api-access-nlthz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:32:47.641919 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.641790 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ec1d15-a3d0-4783-811b-3f22bdfba652-kube-api-access-7xwxr" (OuterVolumeSpecName: "kube-api-access-7xwxr") pod "a6ec1d15-a3d0-4783-811b-3f22bdfba652" (UID: "a6ec1d15-a3d0-4783-811b-3f22bdfba652"). InnerVolumeSpecName "kube-api-access-7xwxr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:32:47.645061 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.645040 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-util" (OuterVolumeSpecName: "util") pod "a6ec1d15-a3d0-4783-811b-3f22bdfba652" (UID: "a6ec1d15-a3d0-4783-811b-3f22bdfba652"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:47.647761 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.647735 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-util" (OuterVolumeSpecName: "util") pod "28f6b4ba-638e-46c3-9357-9840f2c1faee" (UID: "28f6b4ba-638e-46c3-9357-9840f2c1faee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:32:47.740991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.740955 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlthz\" (UniqueName: \"kubernetes.io/projected/28f6b4ba-638e-46c3-9357-9840f2c1faee-kube-api-access-nlthz\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:47.740991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.740984 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:47.740991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.740994 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:47.740991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.741003 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6ec1d15-a3d0-4783-811b-3f22bdfba652-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:47.741254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.741011 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xwxr\" (UniqueName: \"kubernetes.io/projected/a6ec1d15-a3d0-4783-811b-3f22bdfba652-kube-api-access-7xwxr\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:47.741254 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:47.741021 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28f6b4ba-638e-46c3-9357-9840f2c1faee-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:32:48.373363 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:48.373324 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" event={"ID":"a6ec1d15-a3d0-4783-811b-3f22bdfba652","Type":"ContainerDied","Data":"efa3134b07c1e27565bf500b2e7fce406a527ab8ea8ae6b5636b82868cff5534"} Apr 21 04:32:48.373363 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:48.373365 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efa3134b07c1e27565bf500b2e7fce406a527ab8ea8ae6b5636b82868cff5534" Apr 21 04:32:48.373844 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:48.373345 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4" Apr 21 04:32:48.375204 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:48.375172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" event={"ID":"28f6b4ba-638e-46c3-9357-9840f2c1faee","Type":"ContainerDied","Data":"b8b7c1fccf69f352cfb0352163a98b4ab829a22f18f4d10da6f8564c82c6b27e"} Apr 21 04:32:48.375354 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:48.375208 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b7c1fccf69f352cfb0352163a98b4ab829a22f18f4d10da6f8564c82c6b27e" Apr 21 04:32:48.375354 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:32:48.375264 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml" Apr 21 04:33:01.821643 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821608 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm"] Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821898 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821909 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821920 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821925 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821934 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821939 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821950 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821955 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821960 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821966 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821972 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821977 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821983 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821988 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821994 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.821999 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822005 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" 
containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822010 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerName="pull" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822016 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822021 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822026 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822031 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822037 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822042 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="util" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822079 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="73b4592f-9384-4b6f-a122-ad4317c5ffb0" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822087 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="28f6b4ba-638e-46c3-9357-9840f2c1faee" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:33:01.822093 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="be30dc22-b0e8-468c-92aa-a37020853240" containerName="extract" Apr 21 04:33:01.822098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.822100 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6ec1d15-a3d0-4783-811b-3f22bdfba652" containerName="extract" Apr 21 04:33:01.825026 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.825010 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" Apr 21 04:33:01.827945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.827927 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 04:33:01.828023 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.827990 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-9fm2x\"" Apr 21 04:33:01.838388 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.838367 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm"] Apr 21 04:33:01.949446 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:01.949413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkgsw\" (UniqueName: \"kubernetes.io/projected/62de46c9-dd26-484d-a95a-9a697df9677c-kube-api-access-qkgsw\") pod \"dns-operator-controller-manager-648d5c98bc-7d4tm\" (UID: \"62de46c9-dd26-484d-a95a-9a697df9677c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" Apr 21 04:33:02.050043 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:02.050008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkgsw\" (UniqueName: 
\"kubernetes.io/projected/62de46c9-dd26-484d-a95a-9a697df9677c-kube-api-access-qkgsw\") pod \"dns-operator-controller-manager-648d5c98bc-7d4tm\" (UID: \"62de46c9-dd26-484d-a95a-9a697df9677c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" Apr 21 04:33:02.071418 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:02.071385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkgsw\" (UniqueName: \"kubernetes.io/projected/62de46c9-dd26-484d-a95a-9a697df9677c-kube-api-access-qkgsw\") pod \"dns-operator-controller-manager-648d5c98bc-7d4tm\" (UID: \"62de46c9-dd26-484d-a95a-9a697df9677c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" Apr 21 04:33:02.134767 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:02.134710 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" Apr 21 04:33:02.259701 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:02.259663 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm"] Apr 21 04:33:02.263257 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:33:02.263229 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62de46c9_dd26_484d_a95a_9a697df9677c.slice/crio-a1b133ba1dc3aa9a28a09173dc8736462c044a6bca49801bb06771cde528967d WatchSource:0}: Error finding container a1b133ba1dc3aa9a28a09173dc8736462c044a6bca49801bb06771cde528967d: Status 404 returned error can't find the container with id a1b133ba1dc3aa9a28a09173dc8736462c044a6bca49801bb06771cde528967d Apr 21 04:33:02.425636 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:02.425554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" 
event={"ID":"62de46c9-dd26-484d-a95a-9a697df9677c","Type":"ContainerStarted","Data":"a1b133ba1dc3aa9a28a09173dc8736462c044a6bca49801bb06771cde528967d"} Apr 21 04:33:05.438250 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:05.438218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" event={"ID":"62de46c9-dd26-484d-a95a-9a697df9677c","Type":"ContainerStarted","Data":"68a17355d7955256457e5c5174d8755c6d70ab2abce3d4666f58f983033ec3c6"} Apr 21 04:33:05.438611 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:05.438271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" Apr 21 04:33:05.454777 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:05.454710 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm" podStartSLOduration=1.660186808 podStartE2EDuration="4.454695685s" podCreationTimestamp="2026-04-21 04:33:01 +0000 UTC" firstStartedPulling="2026-04-21 04:33:02.265097641 +0000 UTC m=+561.125201599" lastFinishedPulling="2026-04-21 04:33:05.059606505 +0000 UTC m=+563.919710476" observedRunningTime="2026-04-21 04:33:05.452990461 +0000 UTC m=+564.313094465" watchObservedRunningTime="2026-04-21 04:33:05.454695685 +0000 UTC m=+564.314799730" Apr 21 04:33:06.551217 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.551182 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"] Apr 21 04:33:06.553025 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.553009 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.555886 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.555864 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-vc2h7\"" Apr 21 04:33:06.610345 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.610312 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"] Apr 21 04:33:06.688212 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.688174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwhkd\" (UniqueName: \"kubernetes.io/projected/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-kube-api-access-qwhkd\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.688212 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.688219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.788974 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.788921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwhkd\" (UniqueName: \"kubernetes.io/projected/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-kube-api-access-qwhkd\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.788974 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.788976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.789307 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.789290 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.797228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.797197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwhkd\" (UniqueName: \"kubernetes.io/projected/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-kube-api-access-qwhkd\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:06.862706 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:06.862677 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:07.046314 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:07.046288 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"] Apr 21 04:33:07.048621 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:33:07.048594 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9969c71f_57de_4fc2_8bd4_7f1735d51b3f.slice/crio-918c558c962e49bf517d3c781cbc4c63e048ae70eea4e29263518bc8c5dcc954 WatchSource:0}: Error finding container 918c558c962e49bf517d3c781cbc4c63e048ae70eea4e29263518bc8c5dcc954: Status 404 returned error can't find the container with id 918c558c962e49bf517d3c781cbc4c63e048ae70eea4e29263518bc8c5dcc954 Apr 21 04:33:07.445693 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:07.445656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" event={"ID":"9969c71f-57de-4fc2-8bd4-7f1735d51b3f","Type":"ContainerStarted","Data":"918c558c962e49bf517d3c781cbc4c63e048ae70eea4e29263518bc8c5dcc954"} Apr 21 04:33:08.621770 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.621738 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"] Apr 21 04:33:08.623844 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.623826 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" Apr 21 04:33:08.626240 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.626223 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-tp2zw\"" Apr 21 04:33:08.634810 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.634784 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"] Apr 21 04:33:08.806492 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.806455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9swl\" (UniqueName: \"kubernetes.io/projected/1b06078e-bd93-4c76-afba-8d6cedd99fe7-kube-api-access-z9swl\") pod \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" (UID: \"1b06078e-bd93-4c76-afba-8d6cedd99fe7\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" Apr 21 04:33:08.907205 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.907118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9swl\" (UniqueName: \"kubernetes.io/projected/1b06078e-bd93-4c76-afba-8d6cedd99fe7-kube-api-access-z9swl\") pod \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" (UID: \"1b06078e-bd93-4c76-afba-8d6cedd99fe7\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" Apr 21 04:33:08.917879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.917847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9swl\" (UniqueName: \"kubernetes.io/projected/1b06078e-bd93-4c76-afba-8d6cedd99fe7-kube-api-access-z9swl\") pod \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" (UID: \"1b06078e-bd93-4c76-afba-8d6cedd99fe7\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" Apr 21 04:33:08.935125 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:08.935096 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" Apr 21 04:33:09.057338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:09.057312 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"] Apr 21 04:33:09.058817 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:33:09.058789 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b06078e_bd93_4c76_afba_8d6cedd99fe7.slice/crio-115f15e1335b44aa58b3370b2da9830547577edf7c9e279447b1441ed2141378 WatchSource:0}: Error finding container 115f15e1335b44aa58b3370b2da9830547577edf7c9e279447b1441ed2141378: Status 404 returned error can't find the container with id 115f15e1335b44aa58b3370b2da9830547577edf7c9e279447b1441ed2141378 Apr 21 04:33:09.454829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:09.454788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" event={"ID":"1b06078e-bd93-4c76-afba-8d6cedd99fe7","Type":"ContainerStarted","Data":"115f15e1335b44aa58b3370b2da9830547577edf7c9e279447b1441ed2141378"} Apr 21 04:33:12.468963 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:12.468922 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" event={"ID":"1b06078e-bd93-4c76-afba-8d6cedd99fe7","Type":"ContainerStarted","Data":"3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf"} Apr 21 04:33:12.469443 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:12.469056 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" Apr 21 04:33:12.470450 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:12.470425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" event={"ID":"9969c71f-57de-4fc2-8bd4-7f1735d51b3f","Type":"ContainerStarted","Data":"570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3"} Apr 21 04:33:12.470583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:12.470564 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" Apr 21 04:33:12.488165 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:12.488116 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" podStartSLOduration=1.556919995 podStartE2EDuration="4.488102399s" podCreationTimestamp="2026-04-21 04:33:08 +0000 UTC" firstStartedPulling="2026-04-21 04:33:09.061383298 +0000 UTC m=+567.921487255" lastFinishedPulling="2026-04-21 04:33:11.992565686 +0000 UTC m=+570.852669659" observedRunningTime="2026-04-21 04:33:12.487526608 +0000 UTC m=+571.347630587" watchObservedRunningTime="2026-04-21 04:33:12.488102399 +0000 UTC m=+571.348206378" Apr 21 04:33:12.514706 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:12.514655 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" podStartSLOduration=1.5666642309999999 podStartE2EDuration="6.514642313s" podCreationTimestamp="2026-04-21 04:33:06 +0000 UTC" firstStartedPulling="2026-04-21 04:33:07.051158438 +0000 UTC m=+565.911262396" lastFinishedPulling="2026-04-21 04:33:11.999136504 +0000 UTC m=+570.859240478" observedRunningTime="2026-04-21 04:33:12.51304678 +0000 UTC m=+571.373150759" watchObservedRunningTime="2026-04-21 
04:33:12.514642313 +0000 UTC m=+571.374746291"
Apr 21 04:33:16.444016 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:16.443984 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-7d4tm"
Apr 21 04:33:23.479133 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:23.479079 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"
Apr 21 04:33:23.479604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:23.479313 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"
Apr 21 04:33:25.121063 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.121027 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"]
Apr 21 04:33:25.121588 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.121233 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" containerName="manager" containerID="cri-o://570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3" gracePeriod=2
Apr 21 04:33:25.123182 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.123151 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.124195 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.124166 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"]
Apr 21 04:33:25.141118 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.141089 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"]
Apr 21 04:33:25.141428 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.141411 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" containerName="manager"
Apr 21 04:33:25.141507 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.141431 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" containerName="manager"
Apr 21 04:33:25.141558 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.141525 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" containerName="manager"
Apr 21 04:33:25.144525 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.144503 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.146650 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.146618 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.158286 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.158257 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"]
Apr 21 04:33:25.168343 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.168313 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"]
Apr 21 04:33:25.168571 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.168546 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" containerName="manager" containerID="cri-o://3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf" gracePeriod=2
Apr 21 04:33:25.181902 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.181770 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.182649 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.182621 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"]
Apr 21 04:33:25.184183 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.184159 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.216398 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.216369 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"]
Apr 21 04:33:25.216701 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.216686 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" containerName="manager"
Apr 21 04:33:25.216767 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.216703 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" containerName="manager"
Apr 21 04:33:25.216810 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.216775 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" containerName="manager"
Apr 21 04:33:25.219637 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.219613 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:25.219794 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.219609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/df059d8e-0544-4bb5-84ec-db81527d0213-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pjx8q\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.219859 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.219803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlvkh\" (UniqueName: \"kubernetes.io/projected/df059d8e-0544-4bb5-84ec-db81527d0213-kube-api-access-wlvkh\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pjx8q\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.233824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.233786 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"]
Apr 21 04:33:25.252448 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.252415 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.254417 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.254396 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.321042 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.321005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/df059d8e-0544-4bb5-84ec-db81527d0213-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pjx8q\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.321226 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.321065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlvkh\" (UniqueName: \"kubernetes.io/projected/df059d8e-0544-4bb5-84ec-db81527d0213-kube-api-access-wlvkh\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pjx8q\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.321226 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.321096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfb8\" (UniqueName: \"kubernetes.io/projected/0eb69111-0804-4827-97f5-289d251f0d78-kube-api-access-fvfb8\") pod \"limitador-operator-controller-manager-85c4996f8c-wlnws\" (UID: \"0eb69111-0804-4827-97f5-289d251f0d78\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:25.321414 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.321391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/df059d8e-0544-4bb5-84ec-db81527d0213-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pjx8q\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.330106 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.330049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlvkh\" (UniqueName: \"kubernetes.io/projected/df059d8e-0544-4bb5-84ec-db81527d0213-kube-api-access-wlvkh\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-pjx8q\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.397181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.397160 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"
Apr 21 04:33:25.399164 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.399133 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.400074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.400058 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"
Apr 21 04:33:25.400762 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.400715 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.402508 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.402489 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.404044 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.404025 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.421345 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.421325 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwhkd\" (UniqueName: \"kubernetes.io/projected/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-kube-api-access-qwhkd\") pod \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") "
Apr 21 04:33:25.421438 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.421365 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-extensions-socket-volume\") pod \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\" (UID: \"9969c71f-57de-4fc2-8bd4-7f1735d51b3f\") "
Apr 21 04:33:25.421438 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.421400 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9swl\" (UniqueName: \"kubernetes.io/projected/1b06078e-bd93-4c76-afba-8d6cedd99fe7-kube-api-access-z9swl\") pod \"1b06078e-bd93-4c76-afba-8d6cedd99fe7\" (UID: \"1b06078e-bd93-4c76-afba-8d6cedd99fe7\") "
Apr 21 04:33:25.421552 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.421533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfb8\" (UniqueName: \"kubernetes.io/projected/0eb69111-0804-4827-97f5-289d251f0d78-kube-api-access-fvfb8\") pod \"limitador-operator-controller-manager-85c4996f8c-wlnws\" (UID: \"0eb69111-0804-4827-97f5-289d251f0d78\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:25.421975 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.421952 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9969c71f-57de-4fc2-8bd4-7f1735d51b3f" (UID: "9969c71f-57de-4fc2-8bd4-7f1735d51b3f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:33:25.423432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.423413 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b06078e-bd93-4c76-afba-8d6cedd99fe7-kube-api-access-z9swl" (OuterVolumeSpecName: "kube-api-access-z9swl") pod "1b06078e-bd93-4c76-afba-8d6cedd99fe7" (UID: "1b06078e-bd93-4c76-afba-8d6cedd99fe7"). InnerVolumeSpecName "kube-api-access-z9swl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:33:25.423504 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.423438 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-kube-api-access-qwhkd" (OuterVolumeSpecName: "kube-api-access-qwhkd") pod "9969c71f-57de-4fc2-8bd4-7f1735d51b3f" (UID: "9969c71f-57de-4fc2-8bd4-7f1735d51b3f"). InnerVolumeSpecName "kube-api-access-qwhkd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:33:25.434666 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.434638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfb8\" (UniqueName: \"kubernetes.io/projected/0eb69111-0804-4827-97f5-289d251f0d78-kube-api-access-fvfb8\") pod \"limitador-operator-controller-manager-85c4996f8c-wlnws\" (UID: \"0eb69111-0804-4827-97f5-289d251f0d78\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:25.518478 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.518444 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" containerID="3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf" exitCode=0
Apr 21 04:33:25.518636 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.518500 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs"
Apr 21 04:33:25.518636 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.518547 2574 scope.go:117] "RemoveContainer" containerID="3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf"
Apr 21 04:33:25.519845 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.519819 2574 generic.go:358] "Generic (PLEG): container finished" podID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" containerID="570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3" exitCode=0
Apr 21 04:33:25.519967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.519856 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd"
Apr 21 04:33:25.520682 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.520659 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.522308 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.522288 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z9swl\" (UniqueName: \"kubernetes.io/projected/1b06078e-bd93-4c76-afba-8d6cedd99fe7-kube-api-access-z9swl\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:33:25.522375 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.522315 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwhkd\" (UniqueName: \"kubernetes.io/projected/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-kube-api-access-qwhkd\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:33:25.522375 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.522329 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9969c71f-57de-4fc2-8bd4-7f1735d51b3f-extensions-socket-volume\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:33:25.522634 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.522612 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.524424 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.524400 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.526153 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.526130 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.528211 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.528195 2574 scope.go:117] "RemoveContainer" containerID="3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf"
Apr 21 04:33:25.528469 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:33:25.528452 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf\": container with ID starting with 3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf not found: ID does not exist" containerID="3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf"
Apr 21 04:33:25.528515 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.528478 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf"} err="failed to get container status \"3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf\": rpc error: code = NotFound desc = could not find container \"3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf\": container with ID starting with 3c440150f24a0abf0c63f8a24cac07e0cb686beffd69d5fa46214f8bd95524cf not found: ID does not exist"
Apr 21 04:33:25.528515 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.528496 2574 scope.go:117] "RemoveContainer" containerID="570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3"
Apr 21 04:33:25.530070 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.530045 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.532207 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.532185 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.533939 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.533911 2574 status_manager.go:895] "Failed to get status for pod" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fbhcd" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fbhcd\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.535714 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.535683 2574 status_manager.go:895] "Failed to get status for pod" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-qk5rs" err="pods \"limitador-operator-controller-manager-85c4996f8c-qk5rs\" is forbidden: User \"system:node:ip-10-0-140-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-163.ec2.internal' and this object"
Apr 21 04:33:25.536341 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.536325 2574 scope.go:117] "RemoveContainer" containerID="570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3"
Apr 21 04:33:25.536562 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:33:25.536547 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3\": container with ID starting with 570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3 not found: ID does not exist" containerID="570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3"
Apr 21 04:33:25.536605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.536568 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3"} err="failed to get container status \"570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3\": rpc error: code = NotFound desc = could not find container \"570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3\": container with ID starting with 570b811c8dbfd457d9eac38f0229aa237e683f0f948aa1de08820582da54d8b3 not found: ID does not exist"
Apr 21 04:33:25.546798 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.546774 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:25.553455 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.553433 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:25.693130 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.693088 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b06078e-bd93-4c76-afba-8d6cedd99fe7" path="/var/lib/kubelet/pods/1b06078e-bd93-4c76-afba-8d6cedd99fe7/volumes"
Apr 21 04:33:25.693593 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.693577 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9969c71f-57de-4fc2-8bd4-7f1735d51b3f" path="/var/lib/kubelet/pods/9969c71f-57de-4fc2-8bd4-7f1735d51b3f/volumes"
Apr 21 04:33:25.693887 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.693850 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"]
Apr 21 04:33:25.698040 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:25.698019 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"]
Apr 21 04:33:25.700299 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:33:25.700275 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb69111_0804_4827_97f5_289d251f0d78.slice/crio-90d4ee9ac06bdb4748ae3a9bba28cad848ad9d9a6540b9ad2f4b94b0f01d74ea WatchSource:0}: Error finding container 90d4ee9ac06bdb4748ae3a9bba28cad848ad9d9a6540b9ad2f4b94b0f01d74ea: Status 404 returned error can't find the container with id 90d4ee9ac06bdb4748ae3a9bba28cad848ad9d9a6540b9ad2f4b94b0f01d74ea
Apr 21 04:33:26.525004 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.524841 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws" event={"ID":"0eb69111-0804-4827-97f5-289d251f0d78","Type":"ContainerStarted","Data":"9fb01ccddbe26d4c6c42d14a07312fa4bddbcd1afbc0e8cfc07dd8f6d7220cfe"}
Apr 21 04:33:26.525004 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.524899 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:26.525004 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.524914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws" event={"ID":"0eb69111-0804-4827-97f5-289d251f0d78","Type":"ContainerStarted","Data":"90d4ee9ac06bdb4748ae3a9bba28cad848ad9d9a6540b9ad2f4b94b0f01d74ea"}
Apr 21 04:33:26.528459 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.528431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q" event={"ID":"df059d8e-0544-4bb5-84ec-db81527d0213","Type":"ContainerStarted","Data":"cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c"}
Apr 21 04:33:26.528459 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.528462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q" event={"ID":"df059d8e-0544-4bb5-84ec-db81527d0213","Type":"ContainerStarted","Data":"365f918bdb440ef6a22db64fd94e6a4bdfe99acdc0cce69844dd0b73074b511f"}
Apr 21 04:33:26.528666 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.528569 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:26.547642 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.547595 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws" podStartSLOduration=1.547581637 podStartE2EDuration="1.547581637s" podCreationTimestamp="2026-04-21 04:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:33:26.545774981 +0000 UTC m=+585.405878961" watchObservedRunningTime="2026-04-21 04:33:26.547581637 +0000 UTC m=+585.407685669"
Apr 21 04:33:26.566521 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:26.566477 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q" podStartSLOduration=1.566462996 podStartE2EDuration="1.566462996s" podCreationTimestamp="2026-04-21 04:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:33:26.565238952 +0000 UTC m=+585.425342929" watchObservedRunningTime="2026-04-21 04:33:26.566462996 +0000 UTC m=+585.426566976"
Apr 21 04:33:37.534853 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:37.534821 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-wlnws"
Apr 21 04:33:37.535228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:37.534879 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:40.610493 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.610455 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"]
Apr 21 04:33:40.610885 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.610645 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q" podUID="df059d8e-0544-4bb5-84ec-db81527d0213" containerName="manager" containerID="cri-o://cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c" gracePeriod=10
Apr 21 04:33:40.849155 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.849134 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:40.940795 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.940690 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlvkh\" (UniqueName: \"kubernetes.io/projected/df059d8e-0544-4bb5-84ec-db81527d0213-kube-api-access-wlvkh\") pod \"df059d8e-0544-4bb5-84ec-db81527d0213\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") "
Apr 21 04:33:40.940795 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.940757 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/df059d8e-0544-4bb5-84ec-db81527d0213-extensions-socket-volume\") pod \"df059d8e-0544-4bb5-84ec-db81527d0213\" (UID: \"df059d8e-0544-4bb5-84ec-db81527d0213\") "
Apr 21 04:33:40.941097 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.941070 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df059d8e-0544-4bb5-84ec-db81527d0213-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "df059d8e-0544-4bb5-84ec-db81527d0213" (UID: "df059d8e-0544-4bb5-84ec-db81527d0213"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:33:40.942757 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:40.942736 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df059d8e-0544-4bb5-84ec-db81527d0213-kube-api-access-wlvkh" (OuterVolumeSpecName: "kube-api-access-wlvkh") pod "df059d8e-0544-4bb5-84ec-db81527d0213" (UID: "df059d8e-0544-4bb5-84ec-db81527d0213"). InnerVolumeSpecName "kube-api-access-wlvkh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:33:41.041338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.041301 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/df059d8e-0544-4bb5-84ec-db81527d0213-extensions-socket-volume\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:33:41.041338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.041330 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wlvkh\" (UniqueName: \"kubernetes.io/projected/df059d8e-0544-4bb5-84ec-db81527d0213-kube-api-access-wlvkh\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:33:41.581252 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.581221 2574 generic.go:358] "Generic (PLEG): container finished" podID="df059d8e-0544-4bb5-84ec-db81527d0213" containerID="cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c" exitCode=0
Apr 21 04:33:41.581444 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.581279 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"
Apr 21 04:33:41.581444 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.581304 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q" event={"ID":"df059d8e-0544-4bb5-84ec-db81527d0213","Type":"ContainerDied","Data":"cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c"}
Apr 21 04:33:41.581444 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.581345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q" event={"ID":"df059d8e-0544-4bb5-84ec-db81527d0213","Type":"ContainerDied","Data":"365f918bdb440ef6a22db64fd94e6a4bdfe99acdc0cce69844dd0b73074b511f"}
Apr 21 04:33:41.581444 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.581365 2574 scope.go:117] "RemoveContainer" containerID="cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c"
Apr 21 04:33:41.590161 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.590140 2574 scope.go:117] "RemoveContainer" containerID="cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c"
Apr 21 04:33:41.590427 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:33:41.590407 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c\": container with ID starting with cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c not found: ID does not exist" containerID="cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c"
Apr 21 04:33:41.590495 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.590440 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c"} err="failed to get container status 
\"cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c\": rpc error: code = NotFound desc = could not find container \"cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c\": container with ID starting with cca66f9e150416ce4e858044b7d7e766142ae89a8c261a941e5cd12b3199e20c not found: ID does not exist" Apr 21 04:33:41.635519 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.635489 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"] Apr 21 04:33:41.644159 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.644136 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-pjx8q"] Apr 21 04:33:41.645499 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.645475 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log" Apr 21 04:33:41.645828 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.645809 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log" Apr 21 04:33:41.687732 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:41.687680 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df059d8e-0544-4bb5-84ec-db81527d0213" path="/var/lib/kubelet/pods/df059d8e-0544-4bb5-84ec-db81527d0213/volumes" Apr 21 04:33:56.838111 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.838031 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq"] Apr 21 04:33:56.838537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.838306 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df059d8e-0544-4bb5-84ec-db81527d0213" containerName="manager" Apr 21 04:33:56.838537 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.838316 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="df059d8e-0544-4bb5-84ec-db81527d0213" containerName="manager" Apr 21 04:33:56.838537 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.838372 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="df059d8e-0544-4bb5-84ec-db81527d0213" containerName="manager" Apr 21 04:33:56.842847 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.842829 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.847427 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.847405 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-c7nbx\"" Apr 21 04:33:56.858539 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.858514 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq"] Apr 21 04:33:56.957111 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957111 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82nkq\" (UniqueName: \"kubernetes.io/projected/00e10d29-e52c-4844-ba3d-64f2e59fd441-kube-api-access-82nkq\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957334 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957547 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00e10d29-e52c-4844-ba3d-64f2e59fd441-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:56.957547 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:56.957404 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.057804 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 
21 04:33:57.057804 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00e10d29-e52c-4844-ba3d-64f2e59fd441-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057950 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.057984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058269 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058269 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82nkq\" (UniqueName: \"kubernetes.io/projected/00e10d29-e52c-4844-ba3d-64f2e59fd441-kube-api-access-82nkq\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058269 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058417 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058564 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058633 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00e10d29-e52c-4844-ba3d-64f2e59fd441-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.058633 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.058589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: 
\"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.060213 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.060196 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.060552 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.060532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.068631 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.068604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82nkq\" (UniqueName: \"kubernetes.io/projected/00e10d29-e52c-4844-ba3d-64f2e59fd441-kube-api-access-82nkq\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.068863 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.068847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00e10d29-e52c-4844-ba3d-64f2e59fd441-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-78vcq\" (UID: \"00e10d29-e52c-4844-ba3d-64f2e59fd441\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.152172 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:33:57.152097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:57.280387 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.280362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq"] Apr 21 04:33:57.282825 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:33:57.282793 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00e10d29_e52c_4844_ba3d_64f2e59fd441.slice/crio-5ddc9321c7881a22010722e4663edb859e5ed26495c5af0d5d325e26a3d20054 WatchSource:0}: Error finding container 5ddc9321c7881a22010722e4663edb859e5ed26495c5af0d5d325e26a3d20054: Status 404 returned error can't find the container with id 5ddc9321c7881a22010722e4663edb859e5ed26495c5af0d5d325e26a3d20054 Apr 21 04:33:57.284881 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.284850 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 04:33:57.284945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.284910 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 04:33:57.284945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.284938 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 04:33:57.633991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.633953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" 
event={"ID":"00e10d29-e52c-4844-ba3d-64f2e59fd441","Type":"ContainerStarted","Data":"57403e31af74ad43467cbab5e4c0e94c9bac527068d3621ec66d098bb5df2e15"} Apr 21 04:33:57.633991 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.633992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" event={"ID":"00e10d29-e52c-4844-ba3d-64f2e59fd441","Type":"ContainerStarted","Data":"5ddc9321c7881a22010722e4663edb859e5ed26495c5af0d5d325e26a3d20054"} Apr 21 04:33:57.656236 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:57.656190 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" podStartSLOduration=1.656175802 podStartE2EDuration="1.656175802s" podCreationTimestamp="2026-04-21 04:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:33:57.654971062 +0000 UTC m=+616.515075042" watchObservedRunningTime="2026-04-21 04:33:57.656175802 +0000 UTC m=+616.516279781" Apr 21 04:33:58.152311 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:58.152273 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:58.156960 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:58.156935 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:58.638000 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:58.637970 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:33:58.639005 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:33:58.638984 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-78vcq" Apr 21 04:34:16.051652 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.051614 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:16.055015 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.054999 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.057379 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.057349 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 04:34:16.057485 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.057391 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cbpgz\"" Apr 21 04:34:16.062652 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.062626 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:16.147608 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.147574 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:16.219770 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.216111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-config-file\") pod \"limitador-limitador-7d549b5b-hsmbq\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.219770 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.216225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdx4\" (UniqueName: 
\"kubernetes.io/projected/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-kube-api-access-rgdx4\") pod \"limitador-limitador-7d549b5b-hsmbq\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.316941 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.316855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdx4\" (UniqueName: \"kubernetes.io/projected/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-kube-api-access-rgdx4\") pod \"limitador-limitador-7d549b5b-hsmbq\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.316941 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.316927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-config-file\") pod \"limitador-limitador-7d549b5b-hsmbq\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.317538 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.317520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-config-file\") pod \"limitador-limitador-7d549b5b-hsmbq\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.325485 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.325456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdx4\" (UniqueName: \"kubernetes.io/projected/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-kube-api-access-rgdx4\") pod \"limitador-limitador-7d549b5b-hsmbq\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.366431 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:34:16.366396 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:16.495235 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.495213 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:16.496998 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:34:16.496965 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2670eed9_1697_45d1_ac4f_4aac9b63ef2d.slice/crio-ce4fa00f521143e5eec35b4cb3eb853789845a636d513d29e06bc27197e203f2 WatchSource:0}: Error finding container ce4fa00f521143e5eec35b4cb3eb853789845a636d513d29e06bc27197e203f2: Status 404 returned error can't find the container with id ce4fa00f521143e5eec35b4cb3eb853789845a636d513d29e06bc27197e203f2 Apr 21 04:34:16.702845 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.702810 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" event={"ID":"2670eed9-1697-45d1-ac4f-4aac9b63ef2d","Type":"ContainerStarted","Data":"ce4fa00f521143e5eec35b4cb3eb853789845a636d513d29e06bc27197e203f2"} Apr 21 04:34:16.977744 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.977653 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-zrlld"] Apr 21 04:34:16.982167 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.982144 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:16.984621 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.984600 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tzgn6\"" Apr 21 04:34:16.989894 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:16.989860 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-zrlld"] Apr 21 04:34:17.122937 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:17.122900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz8q5\" (UniqueName: \"kubernetes.io/projected/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca-kube-api-access-mz8q5\") pod \"authorino-f99f4b5cd-zrlld\" (UID: \"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca\") " pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:17.223990 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:17.223892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz8q5\" (UniqueName: \"kubernetes.io/projected/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca-kube-api-access-mz8q5\") pod \"authorino-f99f4b5cd-zrlld\" (UID: \"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca\") " pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:17.235702 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:17.235583 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8q5\" (UniqueName: \"kubernetes.io/projected/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca-kube-api-access-mz8q5\") pod \"authorino-f99f4b5cd-zrlld\" (UID: \"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca\") " pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:17.293107 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:17.293071 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:17.459050 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:17.459023 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-zrlld"] Apr 21 04:34:17.461697 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:34:17.461664 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec7c6dc7_ccf5_4634_9d9f_477ebb03feca.slice/crio-7831ca62a1dacf0026e8cc2cbc30db49a05b0314d0ae6af7d35ce0c527943809 WatchSource:0}: Error finding container 7831ca62a1dacf0026e8cc2cbc30db49a05b0314d0ae6af7d35ce0c527943809: Status 404 returned error can't find the container with id 7831ca62a1dacf0026e8cc2cbc30db49a05b0314d0ae6af7d35ce0c527943809 Apr 21 04:34:17.708632 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:17.708599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" event={"ID":"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca","Type":"ContainerStarted","Data":"7831ca62a1dacf0026e8cc2cbc30db49a05b0314d0ae6af7d35ce0c527943809"} Apr 21 04:34:19.720490 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:19.720455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" event={"ID":"2670eed9-1697-45d1-ac4f-4aac9b63ef2d","Type":"ContainerStarted","Data":"2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d"} Apr 21 04:34:19.720982 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:19.720604 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:19.742709 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:19.742646 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" podStartSLOduration=0.933677926 podStartE2EDuration="3.742625781s" 
podCreationTimestamp="2026-04-21 04:34:16 +0000 UTC" firstStartedPulling="2026-04-21 04:34:16.498823731 +0000 UTC m=+635.358927702" lastFinishedPulling="2026-04-21 04:34:19.307771594 +0000 UTC m=+638.167875557" observedRunningTime="2026-04-21 04:34:19.740963324 +0000 UTC m=+638.601067303" watchObservedRunningTime="2026-04-21 04:34:19.742625781 +0000 UTC m=+638.602729761" Apr 21 04:34:21.323313 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:21.323265 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-zrlld"] Apr 21 04:34:21.729084 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:21.729049 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" event={"ID":"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca","Type":"ContainerStarted","Data":"afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c"} Apr 21 04:34:21.779962 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:21.779914 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" podStartSLOduration=2.029069905 podStartE2EDuration="5.779898774s" podCreationTimestamp="2026-04-21 04:34:16 +0000 UTC" firstStartedPulling="2026-04-21 04:34:17.463505011 +0000 UTC m=+636.323608987" lastFinishedPulling="2026-04-21 04:34:21.214333896 +0000 UTC m=+640.074437856" observedRunningTime="2026-04-21 04:34:21.777618747 +0000 UTC m=+640.637722725" watchObservedRunningTime="2026-04-21 04:34:21.779898774 +0000 UTC m=+640.640002752" Apr 21 04:34:22.732919 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:22.732872 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" podUID="ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" containerName="authorino" containerID="cri-o://afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c" gracePeriod=30 Apr 21 04:34:22.970176 ip-10-0-140-163 kubenswrapper[2574]: I0421 
04:34:22.970150 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:23.078806 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.078772 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz8q5\" (UniqueName: \"kubernetes.io/projected/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca-kube-api-access-mz8q5\") pod \"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca\" (UID: \"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca\") " Apr 21 04:34:23.080780 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.080752 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca-kube-api-access-mz8q5" (OuterVolumeSpecName: "kube-api-access-mz8q5") pod "ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" (UID: "ec7c6dc7-ccf5-4634-9d9f-477ebb03feca"). InnerVolumeSpecName "kube-api-access-mz8q5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:34:23.179581 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.179543 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mz8q5\" (UniqueName: \"kubernetes.io/projected/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca-kube-api-access-mz8q5\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:34:23.737032 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.736997 2574 generic.go:358] "Generic (PLEG): container finished" podID="ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" containerID="afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c" exitCode=0 Apr 21 04:34:23.737432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.737047 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" Apr 21 04:34:23.737432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.737086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" event={"ID":"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca","Type":"ContainerDied","Data":"afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c"} Apr 21 04:34:23.737432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.737123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-zrlld" event={"ID":"ec7c6dc7-ccf5-4634-9d9f-477ebb03feca","Type":"ContainerDied","Data":"7831ca62a1dacf0026e8cc2cbc30db49a05b0314d0ae6af7d35ce0c527943809"} Apr 21 04:34:23.737432 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.737138 2574 scope.go:117] "RemoveContainer" containerID="afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c" Apr 21 04:34:23.745370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.745353 2574 scope.go:117] "RemoveContainer" containerID="afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c" Apr 21 04:34:23.745621 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:34:23.745605 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c\": container with ID starting with afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c not found: ID does not exist" containerID="afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c" Apr 21 04:34:23.745673 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.745628 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c"} err="failed to get container status \"afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c\": rpc error: code = 
NotFound desc = could not find container \"afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c\": container with ID starting with afc979e3d3fdab48b056baa9d685481676a695368d77b5e7d38284317325331c not found: ID does not exist" Apr 21 04:34:23.753228 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.753202 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-zrlld"] Apr 21 04:34:23.759084 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:23.759053 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-zrlld"] Apr 21 04:34:25.689870 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:25.688134 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" path="/var/lib/kubelet/pods/ec7c6dc7-ccf5-4634-9d9f-477ebb03feca/volumes" Apr 21 04:34:30.725874 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:30.725842 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:30.800264 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:30.800225 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:30.800463 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:30.800438 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" podUID="2670eed9-1697-45d1-ac4f-4aac9b63ef2d" containerName="limitador" containerID="cri-o://2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d" gracePeriod=30 Apr 21 04:34:31.738250 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.738228 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:31.765691 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.765657 2574 generic.go:358] "Generic (PLEG): container finished" podID="2670eed9-1697-45d1-ac4f-4aac9b63ef2d" containerID="2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d" exitCode=0 Apr 21 04:34:31.765879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.765733 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" event={"ID":"2670eed9-1697-45d1-ac4f-4aac9b63ef2d","Type":"ContainerDied","Data":"2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d"} Apr 21 04:34:31.765879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.765766 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" event={"ID":"2670eed9-1697-45d1-ac4f-4aac9b63ef2d","Type":"ContainerDied","Data":"ce4fa00f521143e5eec35b4cb3eb853789845a636d513d29e06bc27197e203f2"} Apr 21 04:34:31.765879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.765784 2574 scope.go:117] "RemoveContainer" containerID="2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d" Apr 21 04:34:31.765879 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.765807 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-hsmbq" Apr 21 04:34:31.774225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.774212 2574 scope.go:117] "RemoveContainer" containerID="2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d" Apr 21 04:34:31.774482 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:34:31.774463 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d\": container with ID starting with 2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d not found: ID does not exist" containerID="2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d" Apr 21 04:34:31.774559 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.774490 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d"} err="failed to get container status \"2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d\": rpc error: code = NotFound desc = could not find container \"2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d\": container with ID starting with 2cb7bcd3ae897d11296c8a9e39016fdee9e21ff11365b0357bdb4edd393e113d not found: ID does not exist" Apr 21 04:34:31.847885 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.847846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-config-file\") pod \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " Apr 21 04:34:31.848051 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.847939 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgdx4\" (UniqueName: 
\"kubernetes.io/projected/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-kube-api-access-rgdx4\") pod \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\" (UID: \"2670eed9-1697-45d1-ac4f-4aac9b63ef2d\") " Apr 21 04:34:31.848217 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.848194 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-config-file" (OuterVolumeSpecName: "config-file") pod "2670eed9-1697-45d1-ac4f-4aac9b63ef2d" (UID: "2670eed9-1697-45d1-ac4f-4aac9b63ef2d"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:34:31.849969 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.849942 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-kube-api-access-rgdx4" (OuterVolumeSpecName: "kube-api-access-rgdx4") pod "2670eed9-1697-45d1-ac4f-4aac9b63ef2d" (UID: "2670eed9-1697-45d1-ac4f-4aac9b63ef2d"). InnerVolumeSpecName "kube-api-access-rgdx4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:34:31.949122 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.949033 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgdx4\" (UniqueName: \"kubernetes.io/projected/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-kube-api-access-rgdx4\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:34:31.949122 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:31.949068 2574 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2670eed9-1697-45d1-ac4f-4aac9b63ef2d-config-file\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:34:32.086896 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.086865 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:32.090751 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.090713 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-hsmbq"] Apr 21 04:34:32.159966 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.159929 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-ccgk7"] Apr 21 04:34:32.160289 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.160273 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2670eed9-1697-45d1-ac4f-4aac9b63ef2d" containerName="limitador" Apr 21 04:34:32.160371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.160292 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2670eed9-1697-45d1-ac4f-4aac9b63ef2d" containerName="limitador" Apr 21 04:34:32.160371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.160322 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" containerName="authorino" Apr 21 04:34:32.160371 ip-10-0-140-163 kubenswrapper[2574]: I0421 
04:34:32.160331 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" containerName="authorino" Apr 21 04:34:32.160512 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.160421 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2670eed9-1697-45d1-ac4f-4aac9b63ef2d" containerName="limitador" Apr 21 04:34:32.160512 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.160434 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec7c6dc7-ccf5-4634-9d9f-477ebb03feca" containerName="authorino" Apr 21 04:34:32.163368 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.163342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.165455 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.165428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 04:34:32.165604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.165582 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-zt6kf\"" Apr 21 04:34:32.173498 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.173469 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-ccgk7"] Apr 21 04:34:32.352587 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.352559 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ftrm\" (UniqueName: \"kubernetes.io/projected/0fe159dc-e5ab-451b-a362-091d22df0bb9-kube-api-access-5ftrm\") pod \"postgres-868db5846d-ccgk7\" (UID: \"0fe159dc-e5ab-451b-a362-091d22df0bb9\") " pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.352802 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.352602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/0fe159dc-e5ab-451b-a362-091d22df0bb9-data\") pod \"postgres-868db5846d-ccgk7\" (UID: \"0fe159dc-e5ab-451b-a362-091d22df0bb9\") " pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.454005 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.453973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ftrm\" (UniqueName: \"kubernetes.io/projected/0fe159dc-e5ab-451b-a362-091d22df0bb9-kube-api-access-5ftrm\") pod \"postgres-868db5846d-ccgk7\" (UID: \"0fe159dc-e5ab-451b-a362-091d22df0bb9\") " pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.454199 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.454016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0fe159dc-e5ab-451b-a362-091d22df0bb9-data\") pod \"postgres-868db5846d-ccgk7\" (UID: \"0fe159dc-e5ab-451b-a362-091d22df0bb9\") " pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.454413 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.454392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0fe159dc-e5ab-451b-a362-091d22df0bb9-data\") pod \"postgres-868db5846d-ccgk7\" (UID: \"0fe159dc-e5ab-451b-a362-091d22df0bb9\") " pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.463387 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.463357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ftrm\" (UniqueName: \"kubernetes.io/projected/0fe159dc-e5ab-451b-a362-091d22df0bb9-kube-api-access-5ftrm\") pod \"postgres-868db5846d-ccgk7\" (UID: \"0fe159dc-e5ab-451b-a362-091d22df0bb9\") " pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.475061 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.475039 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:32.614289 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.614215 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-ccgk7"] Apr 21 04:34:32.620398 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:34:32.620371 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe159dc_e5ab_451b_a362_091d22df0bb9.slice/crio-e1fc1abd212bec6002c4671184aa9222d6e44b34e8e203716b98b9a42d22de4e WatchSource:0}: Error finding container e1fc1abd212bec6002c4671184aa9222d6e44b34e8e203716b98b9a42d22de4e: Status 404 returned error can't find the container with id e1fc1abd212bec6002c4671184aa9222d6e44b34e8e203716b98b9a42d22de4e Apr 21 04:34:32.770376 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:32.770342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-ccgk7" event={"ID":"0fe159dc-e5ab-451b-a362-091d22df0bb9","Type":"ContainerStarted","Data":"e1fc1abd212bec6002c4671184aa9222d6e44b34e8e203716b98b9a42d22de4e"} Apr 21 04:34:33.687960 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:33.687925 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2670eed9-1697-45d1-ac4f-4aac9b63ef2d" path="/var/lib/kubelet/pods/2670eed9-1697-45d1-ac4f-4aac9b63ef2d/volumes" Apr 21 04:34:38.796628 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:38.796591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-ccgk7" event={"ID":"0fe159dc-e5ab-451b-a362-091d22df0bb9","Type":"ContainerStarted","Data":"48154b0933fe9ebf65de4c85b8d6b836d8308a5e9b83465b26378515e8733f22"} Apr 21 04:34:38.796999 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:38.796690 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:38.816479 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:34:38.816432 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-ccgk7" podStartSLOduration=1.5246203729999999 podStartE2EDuration="6.81641943s" podCreationTimestamp="2026-04-21 04:34:32 +0000 UTC" firstStartedPulling="2026-04-21 04:34:32.62158246 +0000 UTC m=+651.481686416" lastFinishedPulling="2026-04-21 04:34:37.9133815 +0000 UTC m=+656.773485473" observedRunningTime="2026-04-21 04:34:38.814714677 +0000 UTC m=+657.674818659" watchObservedRunningTime="2026-04-21 04:34:38.81641943 +0000 UTC m=+657.676523408" Apr 21 04:34:44.828398 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:44.828370 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-ccgk7" Apr 21 04:34:46.698022 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.697986 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn"] Apr 21 04:34:46.701900 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.701884 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.704763 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.704738 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:34:46.704945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.704926 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:34:46.705656 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.705642 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h9cjj\"" Apr 21 04:34:46.711958 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.711936 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn"] Apr 21 04:34:46.767278 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.767250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvrpb\" (UniqueName: \"kubernetes.io/projected/d774efa7-bbea-4a74-856c-593eff247e57-kube-api-access-gvrpb\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.767460 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.767295 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " 
pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.767460 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.767353 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.867981 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.867944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvrpb\" (UniqueName: \"kubernetes.io/projected/d774efa7-bbea-4a74-856c-593eff247e57-kube-api-access-gvrpb\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.868181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.868009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.868181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.868046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " 
pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.868376 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.868357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.868445 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.868411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:46.877900 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:46.877870 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvrpb\" (UniqueName: \"kubernetes.io/projected/d774efa7-bbea-4a74-856c-593eff247e57-kube-api-access-gvrpb\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:47.011860 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:47.011771 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:47.138383 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:47.138350 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn"] Apr 21 04:34:47.142768 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:34:47.142742 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd774efa7_bbea_4a74_856c_593eff247e57.slice/crio-034f70e7803fe5aa18006be62ab4db12a2bfe8b4b03e6423115e83b6b4de6bb3 WatchSource:0}: Error finding container 034f70e7803fe5aa18006be62ab4db12a2bfe8b4b03e6423115e83b6b4de6bb3: Status 404 returned error can't find the container with id 034f70e7803fe5aa18006be62ab4db12a2bfe8b4b03e6423115e83b6b4de6bb3 Apr 21 04:34:47.827416 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:47.827379 2574 generic.go:358] "Generic (PLEG): container finished" podID="d774efa7-bbea-4a74-856c-593eff247e57" containerID="ef4c0d8ab5b99d80fa6dcaf67ba586a09733ced5f37f40936c36e14ba3bc792e" exitCode=0 Apr 21 04:34:47.827812 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:47.827454 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" event={"ID":"d774efa7-bbea-4a74-856c-593eff247e57","Type":"ContainerDied","Data":"ef4c0d8ab5b99d80fa6dcaf67ba586a09733ced5f37f40936c36e14ba3bc792e"} Apr 21 04:34:47.827812 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:47.827494 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" event={"ID":"d774efa7-bbea-4a74-856c-593eff247e57","Type":"ContainerStarted","Data":"034f70e7803fe5aa18006be62ab4db12a2bfe8b4b03e6423115e83b6b4de6bb3"} Apr 21 04:34:48.833170 ip-10-0-140-163 kubenswrapper[2574]: 
I0421 04:34:48.833139 2574 generic.go:358] "Generic (PLEG): container finished" podID="d774efa7-bbea-4a74-856c-593eff247e57" containerID="cd3749b15d13bffc0cf7a0d7d85a38c258ac9ff598995f5b67ae00078726ef08" exitCode=0 Apr 21 04:34:48.833561 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:48.833197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" event={"ID":"d774efa7-bbea-4a74-856c-593eff247e57","Type":"ContainerDied","Data":"cd3749b15d13bffc0cf7a0d7d85a38c258ac9ff598995f5b67ae00078726ef08"} Apr 21 04:34:49.838547 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:49.838515 2574 generic.go:358] "Generic (PLEG): container finished" podID="d774efa7-bbea-4a74-856c-593eff247e57" containerID="7281e22416a08857b22760c70404fa79a5b38bc582b7ae171a209a7990df3a14" exitCode=0 Apr 21 04:34:49.838933 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:49.838553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" event={"ID":"d774efa7-bbea-4a74-856c-593eff247e57","Type":"ContainerDied","Data":"7281e22416a08857b22760c70404fa79a5b38bc582b7ae171a209a7990df3a14"} Apr 21 04:34:50.960581 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:50.960551 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:50.999600 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:50.999567 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-bundle\") pod \"d774efa7-bbea-4a74-856c-593eff247e57\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " Apr 21 04:34:50.999815 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:50.999610 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-util\") pod \"d774efa7-bbea-4a74-856c-593eff247e57\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " Apr 21 04:34:50.999815 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:50.999696 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvrpb\" (UniqueName: \"kubernetes.io/projected/d774efa7-bbea-4a74-856c-593eff247e57-kube-api-access-gvrpb\") pod \"d774efa7-bbea-4a74-856c-593eff247e57\" (UID: \"d774efa7-bbea-4a74-856c-593eff247e57\") " Apr 21 04:34:51.000067 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.000041 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-bundle" (OuterVolumeSpecName: "bundle") pod "d774efa7-bbea-4a74-856c-593eff247e57" (UID: "d774efa7-bbea-4a74-856c-593eff247e57"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:34:51.001745 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.001703 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d774efa7-bbea-4a74-856c-593eff247e57-kube-api-access-gvrpb" (OuterVolumeSpecName: "kube-api-access-gvrpb") pod "d774efa7-bbea-4a74-856c-593eff247e57" (UID: "d774efa7-bbea-4a74-856c-593eff247e57"). InnerVolumeSpecName "kube-api-access-gvrpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:34:51.004825 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.004796 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-util" (OuterVolumeSpecName: "util") pod "d774efa7-bbea-4a74-856c-593eff247e57" (UID: "d774efa7-bbea-4a74-856c-593eff247e57"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:34:51.100578 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.100499 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-bundle\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:34:51.100578 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.100530 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d774efa7-bbea-4a74-856c-593eff247e57-util\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:34:51.100578 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.100540 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvrpb\" (UniqueName: \"kubernetes.io/projected/d774efa7-bbea-4a74-856c-593eff247e57-kube-api-access-gvrpb\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:34:51.846749 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.846700 2574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" Apr 21 04:34:51.846904 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.846702 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350gv9fn" event={"ID":"d774efa7-bbea-4a74-856c-593eff247e57","Type":"ContainerDied","Data":"034f70e7803fe5aa18006be62ab4db12a2bfe8b4b03e6423115e83b6b4de6bb3"} Apr 21 04:34:51.846904 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:34:51.846833 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="034f70e7803fe5aa18006be62ab4db12a2bfe8b4b03e6423115e83b6b4de6bb3" Apr 21 04:35:10.322881 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.322847 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323167 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d774efa7-bbea-4a74-856c-593eff247e57" containerName="util" Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323182 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d774efa7-bbea-4a74-856c-593eff247e57" containerName="util" Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323194 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d774efa7-bbea-4a74-856c-593eff247e57" containerName="pull" Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323202 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d774efa7-bbea-4a74-856c-593eff247e57" containerName="pull" Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323218 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d774efa7-bbea-4a74-856c-593eff247e57" 
containerName="extract" Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323224 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d774efa7-bbea-4a74-856c-593eff247e57" containerName="extract" Apr 21 04:35:10.323370 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.323276 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d774efa7-bbea-4a74-856c-593eff247e57" containerName="extract" Apr 21 04:35:10.326058 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.326041 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:10.328585 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.328559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 04:35:10.328700 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.328602 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 04:35:10.328700 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.328570 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 04:35:10.329315 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.329299 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-p2s9m\"" Apr 21 04:35:10.335016 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.334994 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:35:10.450769 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.450688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-456gs\" (UniqueName: \"kubernetes.io/projected/f32d77a6-439c-4112-8b10-14a28180dd82-kube-api-access-456gs\") pod \"maas-keycloak-0\" (UID: 
\"f32d77a6-439c-4112-8b10-14a28180dd82\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:10.551844 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.551802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-456gs\" (UniqueName: \"kubernetes.io/projected/f32d77a6-439c-4112-8b10-14a28180dd82-kube-api-access-456gs\") pod \"maas-keycloak-0\" (UID: \"f32d77a6-439c-4112-8b10-14a28180dd82\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:10.564057 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.564022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-456gs\" (UniqueName: \"kubernetes.io/projected/f32d77a6-439c-4112-8b10-14a28180dd82-kube-api-access-456gs\") pod \"maas-keycloak-0\" (UID: \"f32d77a6-439c-4112-8b10-14a28180dd82\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:10.636106 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.636074 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:10.762166 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.762125 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:35:10.767095 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:35:10.767067 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf32d77a6_439c_4112_8b10_14a28180dd82.slice/crio-9dca34a5f629c12989dc6328be716ce311223a7a22576339310679825de2a243 WatchSource:0}: Error finding container 9dca34a5f629c12989dc6328be716ce311223a7a22576339310679825de2a243: Status 404 returned error can't find the container with id 9dca34a5f629c12989dc6328be716ce311223a7a22576339310679825de2a243 Apr 21 04:35:10.768286 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.768268 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:35:10.916464 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:10.916374 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"f32d77a6-439c-4112-8b10-14a28180dd82","Type":"ContainerStarted","Data":"9dca34a5f629c12989dc6328be716ce311223a7a22576339310679825de2a243"} Apr 21 04:35:15.939299 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:15.939261 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"f32d77a6-439c-4112-8b10-14a28180dd82","Type":"ContainerStarted","Data":"6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244"} Apr 21 04:35:15.959781 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:15.959699 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=0.925746886 podStartE2EDuration="5.95967019s" podCreationTimestamp="2026-04-21 04:35:10 +0000 UTC" firstStartedPulling="2026-04-21 04:35:10.768381983 
+0000 UTC m=+689.628485943" lastFinishedPulling="2026-04-21 04:35:15.80230529 +0000 UTC m=+694.662409247" observedRunningTime="2026-04-21 04:35:15.955392133 +0000 UTC m=+694.815496113" watchObservedRunningTime="2026-04-21 04:35:15.95967019 +0000 UTC m=+694.819774226" Apr 21 04:35:16.636420 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:16.636380 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:16.638196 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:16.638164 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:17.636806 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:17.636757 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:18.636504 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:18.636460 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:19.636525 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:19.636479 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 
04:35:20.636805 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:20.636760 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:20.637401 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:20.637202 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:21.636645 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:21.636595 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:22.637188 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:22.637139 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:23.636948 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:23.636885 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:24.637296 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:24.637243 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get 
\"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:25.636827 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:25.636770 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:26.637561 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:26.637440 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:27.636845 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:27.636799 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:28.637146 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:28.637087 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.40:9000/health/started\": dial tcp 10.132.0.40:9000: connect: connection refused" Apr 21 04:35:29.768620 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:29.768580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:29.786679 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:29.786635 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" 
podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:35:39.774860 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:39.774823 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 04:35:40.650835 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.650797 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5lgsc"] Apr 21 04:35:40.661275 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.661246 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5lgsc"] Apr 21 04:35:40.661430 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.661358 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 21 04:35:40.663813 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.663790 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tzgn6\"" Apr 21 04:35:40.720311 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.720273 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv7d7\" (UniqueName: \"kubernetes.io/projected/32ee4f36-9124-4059-b083-d7bd34c49715-kube-api-access-tv7d7\") pod \"authorino-8b475cf9f-5lgsc\" (UID: \"32ee4f36-9124-4059-b083-d7bd34c49715\") " pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 21 04:35:40.821402 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.821363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv7d7\" (UniqueName: \"kubernetes.io/projected/32ee4f36-9124-4059-b083-d7bd34c49715-kube-api-access-tv7d7\") pod \"authorino-8b475cf9f-5lgsc\" (UID: \"32ee4f36-9124-4059-b083-d7bd34c49715\") " pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 
21 04:35:40.831637 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.831610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv7d7\" (UniqueName: \"kubernetes.io/projected/32ee4f36-9124-4059-b083-d7bd34c49715-kube-api-access-tv7d7\") pod \"authorino-8b475cf9f-5lgsc\" (UID: \"32ee4f36-9124-4059-b083-d7bd34c49715\") " pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 21 04:35:40.882371 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.882332 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5lgsc"] Apr 21 04:35:40.882597 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.882581 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 21 04:35:40.904755 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.904677 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7ccc77f495-6sx5t"] Apr 21 04:35:40.933949 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.933889 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7ccc77f495-6sx5t"] Apr 21 04:35:40.934110 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:40.934077 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7ccc77f495-6sx5t" Apr 21 04:35:41.002809 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.002777 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5lgsc"] Apr 21 04:35:41.005457 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:35:41.005431 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ee4f36_9124_4059_b083_d7bd34c49715.slice/crio-d9b1ee655a44d46bbb9d1eea91a820e3157bf6ff6fc8ffcc3ce461bced23e4c8 WatchSource:0}: Error finding container d9b1ee655a44d46bbb9d1eea91a820e3157bf6ff6fc8ffcc3ce461bced23e4c8: Status 404 returned error can't find the container with id d9b1ee655a44d46bbb9d1eea91a820e3157bf6ff6fc8ffcc3ce461bced23e4c8 Apr 21 04:35:41.023482 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.023457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skzpq\" (UniqueName: \"kubernetes.io/projected/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c-kube-api-access-skzpq\") pod \"authorino-7ccc77f495-6sx5t\" (UID: \"5f9d9e50-1ab6-46ec-a39c-abf612f8e84c\") " pod="kuadrant-system/authorino-7ccc77f495-6sx5t" Apr 21 04:35:41.065604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.065568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" event={"ID":"32ee4f36-9124-4059-b083-d7bd34c49715","Type":"ContainerStarted","Data":"d9b1ee655a44d46bbb9d1eea91a820e3157bf6ff6fc8ffcc3ce461bced23e4c8"} Apr 21 04:35:41.069132 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.069104 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7ccc77f495-6sx5t"] Apr 21 04:35:41.069345 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:35:41.069325 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-skzpq], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7ccc77f495-6sx5t" podUID="5f9d9e50-1ab6-46ec-a39c-abf612f8e84c" Apr 21 04:35:41.094356 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.094327 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-85599cd679-w2cl5"] Apr 21 04:35:41.124187 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.124156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skzpq\" (UniqueName: \"kubernetes.io/projected/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c-kube-api-access-skzpq\") pod \"authorino-7ccc77f495-6sx5t\" (UID: \"5f9d9e50-1ab6-46ec-a39c-abf612f8e84c\") " pod="kuadrant-system/authorino-7ccc77f495-6sx5t" Apr 21 04:35:41.125313 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.125292 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85599cd679-w2cl5"] Apr 21 04:35:41.125430 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.125419 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.127608 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.127587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 04:35:41.132060 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.132037 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skzpq\" (UniqueName: \"kubernetes.io/projected/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c-kube-api-access-skzpq\") pod \"authorino-7ccc77f495-6sx5t\" (UID: \"5f9d9e50-1ab6-46ec-a39c-abf612f8e84c\") " pod="kuadrant-system/authorino-7ccc77f495-6sx5t" Apr 21 04:35:41.225274 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.225191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/afb9faad-14c7-4f6b-be77-109601000770-tls-cert\") pod \"authorino-85599cd679-w2cl5\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.225274 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.225256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4sk4\" (UniqueName: \"kubernetes.io/projected/afb9faad-14c7-4f6b-be77-109601000770-kube-api-access-d4sk4\") pod \"authorino-85599cd679-w2cl5\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.325660 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.325625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4sk4\" (UniqueName: \"kubernetes.io/projected/afb9faad-14c7-4f6b-be77-109601000770-kube-api-access-d4sk4\") pod \"authorino-85599cd679-w2cl5\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 
04:35:41.325846 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.325694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/afb9faad-14c7-4f6b-be77-109601000770-tls-cert\") pod \"authorino-85599cd679-w2cl5\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.328047 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.328029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/afb9faad-14c7-4f6b-be77-109601000770-tls-cert\") pod \"authorino-85599cd679-w2cl5\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.333105 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.333085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4sk4\" (UniqueName: \"kubernetes.io/projected/afb9faad-14c7-4f6b-be77-109601000770-kube-api-access-d4sk4\") pod \"authorino-85599cd679-w2cl5\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.442280 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.442245 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:35:41.614769 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:41.614742 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85599cd679-w2cl5"] Apr 21 04:35:41.616475 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:35:41.616447 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb9faad_14c7_4f6b_be77_109601000770.slice/crio-1c74d6d8d472f80ac8c6e5ad488a2568a3c43cca4402f8ac3f93b18309c434bc WatchSource:0}: Error finding container 1c74d6d8d472f80ac8c6e5ad488a2568a3c43cca4402f8ac3f93b18309c434bc: Status 404 returned error can't find the container with id 1c74d6d8d472f80ac8c6e5ad488a2568a3c43cca4402f8ac3f93b18309c434bc Apr 21 04:35:42.070633 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.070598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" event={"ID":"32ee4f36-9124-4059-b083-d7bd34c49715","Type":"ContainerStarted","Data":"03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2"} Apr 21 04:35:42.071255 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.070701 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" podUID="32ee4f36-9124-4059-b083-d7bd34c49715" containerName="authorino" containerID="cri-o://03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2" gracePeriod=30 Apr 21 04:35:42.072048 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.072021 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7ccc77f495-6sx5t" Apr 21 04:35:42.072161 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.072022 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85599cd679-w2cl5" event={"ID":"afb9faad-14c7-4f6b-be77-109601000770","Type":"ContainerStarted","Data":"1c74d6d8d472f80ac8c6e5ad488a2568a3c43cca4402f8ac3f93b18309c434bc"} Apr 21 04:35:42.077259 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.077235 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7ccc77f495-6sx5t" Apr 21 04:35:42.084853 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.084807 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" podStartSLOduration=1.5976711319999999 podStartE2EDuration="2.084793179s" podCreationTimestamp="2026-04-21 04:35:40 +0000 UTC" firstStartedPulling="2026-04-21 04:35:41.007197019 +0000 UTC m=+719.867300976" lastFinishedPulling="2026-04-21 04:35:41.494319066 +0000 UTC m=+720.354423023" observedRunningTime="2026-04-21 04:35:42.083857564 +0000 UTC m=+720.943961555" watchObservedRunningTime="2026-04-21 04:35:42.084793179 +0000 UTC m=+720.944897159" Apr 21 04:35:42.131041 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.131018 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skzpq\" (UniqueName: \"kubernetes.io/projected/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c-kube-api-access-skzpq\") pod \"5f9d9e50-1ab6-46ec-a39c-abf612f8e84c\" (UID: \"5f9d9e50-1ab6-46ec-a39c-abf612f8e84c\") " Apr 21 04:35:42.133022 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.133000 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c-kube-api-access-skzpq" (OuterVolumeSpecName: "kube-api-access-skzpq") pod "5f9d9e50-1ab6-46ec-a39c-abf612f8e84c" (UID: 
"5f9d9e50-1ab6-46ec-a39c-abf612f8e84c"). InnerVolumeSpecName "kube-api-access-skzpq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:35:42.231986 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.231948 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skzpq\" (UniqueName: \"kubernetes.io/projected/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c-kube-api-access-skzpq\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:35:42.319513 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.319490 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 21 04:35:42.332181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.332157 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv7d7\" (UniqueName: \"kubernetes.io/projected/32ee4f36-9124-4059-b083-d7bd34c49715-kube-api-access-tv7d7\") pod \"32ee4f36-9124-4059-b083-d7bd34c49715\" (UID: \"32ee4f36-9124-4059-b083-d7bd34c49715\") " Apr 21 04:35:42.334435 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.334402 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ee4f36-9124-4059-b083-d7bd34c49715-kube-api-access-tv7d7" (OuterVolumeSpecName: "kube-api-access-tv7d7") pod "32ee4f36-9124-4059-b083-d7bd34c49715" (UID: "32ee4f36-9124-4059-b083-d7bd34c49715"). InnerVolumeSpecName "kube-api-access-tv7d7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:35:42.432822 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:42.432797 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tv7d7\" (UniqueName: \"kubernetes.io/projected/32ee4f36-9124-4059-b083-d7bd34c49715-kube-api-access-tv7d7\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:35:43.076542 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.076508 2574 generic.go:358] "Generic (PLEG): container finished" podID="32ee4f36-9124-4059-b083-d7bd34c49715" containerID="03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2" exitCode=0 Apr 21 04:35:43.076994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.076559 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" Apr 21 04:35:43.076994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.076580 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" event={"ID":"32ee4f36-9124-4059-b083-d7bd34c49715","Type":"ContainerDied","Data":"03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2"} Apr 21 04:35:43.076994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.076613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5lgsc" event={"ID":"32ee4f36-9124-4059-b083-d7bd34c49715","Type":"ContainerDied","Data":"d9b1ee655a44d46bbb9d1eea91a820e3157bf6ff6fc8ffcc3ce461bced23e4c8"} Apr 21 04:35:43.076994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.076632 2574 scope.go:117] "RemoveContainer" containerID="03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2" Apr 21 04:35:43.078086 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.078059 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85599cd679-w2cl5" 
event={"ID":"afb9faad-14c7-4f6b-be77-109601000770","Type":"ContainerStarted","Data":"d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f"}
Apr 21 04:35:43.078086 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.078075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7ccc77f495-6sx5t"
Apr 21 04:35:43.086365 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.086312 2574 scope.go:117] "RemoveContainer" containerID="03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2"
Apr 21 04:35:43.086638 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:35:43.086612 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2\": container with ID starting with 03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2 not found: ID does not exist" containerID="03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2"
Apr 21 04:35:43.086920 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.086649 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2"} err="failed to get container status \"03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2\": rpc error: code = NotFound desc = could not find container \"03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2\": container with ID starting with 03d977e3a00d2dcb2e1b152e83b7d489f77ff3ad5672b62df8eda302572d09c2 not found: ID does not exist"
Apr 21 04:35:43.097591 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.097516 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-85599cd679-w2cl5" podStartSLOduration=1.389135878 podStartE2EDuration="2.097505218s" podCreationTimestamp="2026-04-21 04:35:41 +0000 UTC" firstStartedPulling="2026-04-21 04:35:41.617934193 +0000 UTC m=+720.478038151" lastFinishedPulling="2026-04-21 04:35:42.326303521 +0000 UTC m=+721.186407491" observedRunningTime="2026-04-21 04:35:43.094912288 +0000 UTC m=+721.955016266" watchObservedRunningTime="2026-04-21 04:35:43.097505218 +0000 UTC m=+721.957609197"
Apr 21 04:35:43.112958 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.112937 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5lgsc"]
Apr 21 04:35:43.119747 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.119705 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5lgsc"]
Apr 21 04:35:43.146624 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.146583 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7ccc77f495-6sx5t"]
Apr 21 04:35:43.150603 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.150576 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7ccc77f495-6sx5t"]
Apr 21 04:35:43.687828 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.687796 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ee4f36-9124-4059-b083-d7bd34c49715" path="/var/lib/kubelet/pods/32ee4f36-9124-4059-b083-d7bd34c49715/volumes"
Apr 21 04:35:43.688124 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.688112 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9d9e50-1ab6-46ec-a39c-abf612f8e84c" path="/var/lib/kubelet/pods/5f9d9e50-1ab6-46ec-a39c-abf612f8e84c/volumes"
Apr 21 04:35:43.993281 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.993202 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-847968d9df-tnrwh"]
Apr 21 04:35:43.993516 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.993504 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32ee4f36-9124-4059-b083-d7bd34c49715" containerName="authorino"
Apr 21 04:35:43.993564 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.993517 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ee4f36-9124-4059-b083-d7bd34c49715" containerName="authorino"
Apr 21 04:35:43.993599 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.993566 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="32ee4f36-9124-4059-b083-d7bd34c49715" containerName="authorino"
Apr 21 04:35:43.998654 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:43.998632 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:44.001192 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.001165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-l2vw6\""
Apr 21 04:35:44.002928 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.002902 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-847968d9df-tnrwh"]
Apr 21 04:35:44.139478 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.139450 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f6f645c74-x5m9b"]
Apr 21 04:35:44.142968 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.142952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:44.147531 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.147496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9k6\" (UniqueName: \"kubernetes.io/projected/deb6fbd1-3b26-434a-a1d3-aa49261e0d90-kube-api-access-jh9k6\") pod \"maas-controller-847968d9df-tnrwh\" (UID: \"deb6fbd1-3b26-434a-a1d3-aa49261e0d90\") " pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:44.149047 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.149023 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f6f645c74-x5m9b"]
Apr 21 04:35:44.248714 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.248624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4nq\" (UniqueName: \"kubernetes.io/projected/531f8c1b-7669-482b-997a-314d224aed15-kube-api-access-qz4nq\") pod \"maas-controller-f6f645c74-x5m9b\" (UID: \"531f8c1b-7669-482b-997a-314d224aed15\") " pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:44.248714 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.248663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9k6\" (UniqueName: \"kubernetes.io/projected/deb6fbd1-3b26-434a-a1d3-aa49261e0d90-kube-api-access-jh9k6\") pod \"maas-controller-847968d9df-tnrwh\" (UID: \"deb6fbd1-3b26-434a-a1d3-aa49261e0d90\") " pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:44.257078 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.257056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9k6\" (UniqueName: \"kubernetes.io/projected/deb6fbd1-3b26-434a-a1d3-aa49261e0d90-kube-api-access-jh9k6\") pod \"maas-controller-847968d9df-tnrwh\" (UID: \"deb6fbd1-3b26-434a-a1d3-aa49261e0d90\") " pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:44.310915 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.310876 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:44.349982 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.349948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4nq\" (UniqueName: \"kubernetes.io/projected/531f8c1b-7669-482b-997a-314d224aed15-kube-api-access-qz4nq\") pod \"maas-controller-f6f645c74-x5m9b\" (UID: \"531f8c1b-7669-482b-997a-314d224aed15\") " pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:44.360930 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.359294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4nq\" (UniqueName: \"kubernetes.io/projected/531f8c1b-7669-482b-997a-314d224aed15-kube-api-access-qz4nq\") pod \"maas-controller-f6f645c74-x5m9b\" (UID: \"531f8c1b-7669-482b-997a-314d224aed15\") " pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:44.429945 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.429916 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-847968d9df-tnrwh"]
Apr 21 04:35:44.431304 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:35:44.431281 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeb6fbd1_3b26_434a_a1d3_aa49261e0d90.slice/crio-af8fdb927a1c106d85667ee49ee27f0d7c2a1b9134a075c8b1762f423b677c15 WatchSource:0}: Error finding container af8fdb927a1c106d85667ee49ee27f0d7c2a1b9134a075c8b1762f423b677c15: Status 404 returned error can't find the container with id af8fdb927a1c106d85667ee49ee27f0d7c2a1b9134a075c8b1762f423b677c15
Apr 21 04:35:44.454464 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.454437 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:44.571398 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:44.571369 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f6f645c74-x5m9b"]
Apr 21 04:35:44.572432 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:35:44.572406 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531f8c1b_7669_482b_997a_314d224aed15.slice/crio-e480d5094a5a7948f7189dfc43df4e1ddc6fd8fd0059201acd84e24a19a61823 WatchSource:0}: Error finding container e480d5094a5a7948f7189dfc43df4e1ddc6fd8fd0059201acd84e24a19a61823: Status 404 returned error can't find the container with id e480d5094a5a7948f7189dfc43df4e1ddc6fd8fd0059201acd84e24a19a61823
Apr 21 04:35:45.088790 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:45.088753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-847968d9df-tnrwh" event={"ID":"deb6fbd1-3b26-434a-a1d3-aa49261e0d90","Type":"ContainerStarted","Data":"af8fdb927a1c106d85667ee49ee27f0d7c2a1b9134a075c8b1762f423b677c15"}
Apr 21 04:35:45.090383 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:45.090345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f6f645c74-x5m9b" event={"ID":"531f8c1b-7669-482b-997a-314d224aed15","Type":"ContainerStarted","Data":"e480d5094a5a7948f7189dfc43df4e1ddc6fd8fd0059201acd84e24a19a61823"}
Apr 21 04:35:48.104123 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:48.104073 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f6f645c74-x5m9b" event={"ID":"531f8c1b-7669-482b-997a-314d224aed15","Type":"ContainerStarted","Data":"c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466"}
Apr 21 04:35:48.104577 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:48.104222 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:48.105428 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:48.105407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-847968d9df-tnrwh" event={"ID":"deb6fbd1-3b26-434a-a1d3-aa49261e0d90","Type":"ContainerStarted","Data":"097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406"}
Apr 21 04:35:48.105563 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:48.105548 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:48.120079 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:48.120037 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f6f645c74-x5m9b" podStartSLOduration=1.609378624 podStartE2EDuration="4.120026363s" podCreationTimestamp="2026-04-21 04:35:44 +0000 UTC" firstStartedPulling="2026-04-21 04:35:44.57375708 +0000 UTC m=+723.433861037" lastFinishedPulling="2026-04-21 04:35:47.084404816 +0000 UTC m=+725.944508776" observedRunningTime="2026-04-21 04:35:48.118023923 +0000 UTC m=+726.978127903" watchObservedRunningTime="2026-04-21 04:35:48.120026363 +0000 UTC m=+726.980130342"
Apr 21 04:35:48.132704 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:48.132661 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-847968d9df-tnrwh" podStartSLOduration=2.491283602 podStartE2EDuration="5.132650466s" podCreationTimestamp="2026-04-21 04:35:43 +0000 UTC" firstStartedPulling="2026-04-21 04:35:44.432705764 +0000 UTC m=+723.292809720" lastFinishedPulling="2026-04-21 04:35:47.074072612 +0000 UTC m=+725.934176584" observedRunningTime="2026-04-21 04:35:48.131112092 +0000 UTC m=+726.991216072" watchObservedRunningTime="2026-04-21 04:35:48.132650466 +0000 UTC m=+726.992754445"
Apr 21 04:35:59.115138 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.115105 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:59.115564 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.115327 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:35:59.167054 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.167020 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-847968d9df-tnrwh"]
Apr 21 04:35:59.167238 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.167211 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-847968d9df-tnrwh" podUID="deb6fbd1-3b26-434a-a1d3-aa49261e0d90" containerName="manager" containerID="cri-o://097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406" gracePeriod=10
Apr 21 04:35:59.410389 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.410369 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:35:59.454629 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.454595 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5dc44bd94b-cct56"]
Apr 21 04:35:59.454922 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.454910 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deb6fbd1-3b26-434a-a1d3-aa49261e0d90" containerName="manager"
Apr 21 04:35:59.454975 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.454924 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb6fbd1-3b26-434a-a1d3-aa49261e0d90" containerName="manager"
Apr 21 04:35:59.455010 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.454980 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="deb6fbd1-3b26-434a-a1d3-aa49261e0d90" containerName="manager"
Apr 21 04:35:59.457822 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.457805 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:35:59.464583 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.464558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5dc44bd94b-cct56"]
Apr 21 04:35:59.474181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.474159 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh9k6\" (UniqueName: \"kubernetes.io/projected/deb6fbd1-3b26-434a-a1d3-aa49261e0d90-kube-api-access-jh9k6\") pod \"deb6fbd1-3b26-434a-a1d3-aa49261e0d90\" (UID: \"deb6fbd1-3b26-434a-a1d3-aa49261e0d90\") "
Apr 21 04:35:59.476331 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.476302 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb6fbd1-3b26-434a-a1d3-aa49261e0d90-kube-api-access-jh9k6" (OuterVolumeSpecName: "kube-api-access-jh9k6") pod "deb6fbd1-3b26-434a-a1d3-aa49261e0d90" (UID: "deb6fbd1-3b26-434a-a1d3-aa49261e0d90"). InnerVolumeSpecName "kube-api-access-jh9k6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:35:59.575273 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.575232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg9fz\" (UniqueName: \"kubernetes.io/projected/91342866-b823-44bb-b2fe-98e443f14fc2-kube-api-access-gg9fz\") pod \"maas-controller-5dc44bd94b-cct56\" (UID: \"91342866-b823-44bb-b2fe-98e443f14fc2\") " pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:35:59.575424 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.575324 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jh9k6\" (UniqueName: \"kubernetes.io/projected/deb6fbd1-3b26-434a-a1d3-aa49261e0d90-kube-api-access-jh9k6\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:35:59.675971 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.675885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg9fz\" (UniqueName: \"kubernetes.io/projected/91342866-b823-44bb-b2fe-98e443f14fc2-kube-api-access-gg9fz\") pod \"maas-controller-5dc44bd94b-cct56\" (UID: \"91342866-b823-44bb-b2fe-98e443f14fc2\") " pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:35:59.683973 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.683936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg9fz\" (UniqueName: \"kubernetes.io/projected/91342866-b823-44bb-b2fe-98e443f14fc2-kube-api-access-gg9fz\") pod \"maas-controller-5dc44bd94b-cct56\" (UID: \"91342866-b823-44bb-b2fe-98e443f14fc2\") " pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:35:59.768396 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.768351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:35:59.890387 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:35:59.890355 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5dc44bd94b-cct56"]
Apr 21 04:35:59.893433 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:35:59.893407 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91342866_b823_44bb_b2fe_98e443f14fc2.slice/crio-b47e9533771db6cdb7c09f2ef077e9dd6cf9a9fdff54d483cad891076ce4551c WatchSource:0}: Error finding container b47e9533771db6cdb7c09f2ef077e9dd6cf9a9fdff54d483cad891076ce4551c: Status 404 returned error can't find the container with id b47e9533771db6cdb7c09f2ef077e9dd6cf9a9fdff54d483cad891076ce4551c
Apr 21 04:36:00.149669 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.149631 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5dc44bd94b-cct56" event={"ID":"91342866-b823-44bb-b2fe-98e443f14fc2","Type":"ContainerStarted","Data":"b47e9533771db6cdb7c09f2ef077e9dd6cf9a9fdff54d483cad891076ce4551c"}
Apr 21 04:36:00.150605 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.150575 2574 generic.go:358] "Generic (PLEG): container finished" podID="deb6fbd1-3b26-434a-a1d3-aa49261e0d90" containerID="097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406" exitCode=0
Apr 21 04:36:00.150742 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.150631 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-847968d9df-tnrwh" event={"ID":"deb6fbd1-3b26-434a-a1d3-aa49261e0d90","Type":"ContainerDied","Data":"097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406"}
Apr 21 04:36:00.150742 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.150638 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-847968d9df-tnrwh"
Apr 21 04:36:00.150742 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.150660 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-847968d9df-tnrwh" event={"ID":"deb6fbd1-3b26-434a-a1d3-aa49261e0d90","Type":"ContainerDied","Data":"af8fdb927a1c106d85667ee49ee27f0d7c2a1b9134a075c8b1762f423b677c15"}
Apr 21 04:36:00.150742 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.150678 2574 scope.go:117] "RemoveContainer" containerID="097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406"
Apr 21 04:36:00.160686 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.160669 2574 scope.go:117] "RemoveContainer" containerID="097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406"
Apr 21 04:36:00.160945 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:36:00.160928 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406\": container with ID starting with 097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406 not found: ID does not exist" containerID="097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406"
Apr 21 04:36:00.161000 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.160953 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406"} err="failed to get container status \"097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406\": rpc error: code = NotFound desc = could not find container \"097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406\": container with ID starting with 097bc530625a9a00ad09def5a1e954660c2b9657c88da4fb314e92b34c79d406 not found: ID does not exist"
Apr 21 04:36:00.167401 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.167376 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-847968d9df-tnrwh"]
Apr 21 04:36:00.170943 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:00.170923 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-847968d9df-tnrwh"]
Apr 21 04:36:01.155925 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:01.155888 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5dc44bd94b-cct56" event={"ID":"91342866-b823-44bb-b2fe-98e443f14fc2","Type":"ContainerStarted","Data":"263676fd953cbe901e90f4fa095a6a5b676a79540009b3c84eb52f13ab01e9b0"}
Apr 21 04:36:01.156377 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:01.155945 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:36:01.173505 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:01.173460 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5dc44bd94b-cct56" podStartSLOduration=1.781007186 podStartE2EDuration="2.173447407s" podCreationTimestamp="2026-04-21 04:35:59 +0000 UTC" firstStartedPulling="2026-04-21 04:35:59.894643935 +0000 UTC m=+738.754747895" lastFinishedPulling="2026-04-21 04:36:00.287084156 +0000 UTC m=+739.147188116" observedRunningTime="2026-04-21 04:36:01.171016065 +0000 UTC m=+740.031120055" watchObservedRunningTime="2026-04-21 04:36:01.173447407 +0000 UTC m=+740.033551385"
Apr 21 04:36:01.691934 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:01.691889 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb6fbd1-3b26-434a-a1d3-aa49261e0d90" path="/var/lib/kubelet/pods/deb6fbd1-3b26-434a-a1d3-aa49261e0d90/volumes"
Apr 21 04:36:12.165265 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.165227 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5dc44bd94b-cct56"
Apr 21 04:36:12.204841 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.204814 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f6f645c74-x5m9b"]
Apr 21 04:36:12.205074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.205045 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f6f645c74-x5m9b" podUID="531f8c1b-7669-482b-997a-314d224aed15" containerName="manager" containerID="cri-o://c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466" gracePeriod=10
Apr 21 04:36:12.447541 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.447517 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:36:12.594982 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.594944 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz4nq\" (UniqueName: \"kubernetes.io/projected/531f8c1b-7669-482b-997a-314d224aed15-kube-api-access-qz4nq\") pod \"531f8c1b-7669-482b-997a-314d224aed15\" (UID: \"531f8c1b-7669-482b-997a-314d224aed15\") "
Apr 21 04:36:12.597041 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.597004 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531f8c1b-7669-482b-997a-314d224aed15-kube-api-access-qz4nq" (OuterVolumeSpecName: "kube-api-access-qz4nq") pod "531f8c1b-7669-482b-997a-314d224aed15" (UID: "531f8c1b-7669-482b-997a-314d224aed15"). InnerVolumeSpecName "kube-api-access-qz4nq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:36:12.695988 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:12.695896 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qz4nq\" (UniqueName: \"kubernetes.io/projected/531f8c1b-7669-482b-997a-314d224aed15-kube-api-access-qz4nq\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:36:13.202098 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.202063 2574 generic.go:358] "Generic (PLEG): container finished" podID="531f8c1b-7669-482b-997a-314d224aed15" containerID="c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466" exitCode=0
Apr 21 04:36:13.202638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.202121 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f6f645c74-x5m9b"
Apr 21 04:36:13.202638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.202151 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f6f645c74-x5m9b" event={"ID":"531f8c1b-7669-482b-997a-314d224aed15","Type":"ContainerDied","Data":"c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466"}
Apr 21 04:36:13.202638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.202200 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f6f645c74-x5m9b" event={"ID":"531f8c1b-7669-482b-997a-314d224aed15","Type":"ContainerDied","Data":"e480d5094a5a7948f7189dfc43df4e1ddc6fd8fd0059201acd84e24a19a61823"}
Apr 21 04:36:13.202638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.202221 2574 scope.go:117] "RemoveContainer" containerID="c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466"
Apr 21 04:36:13.211656 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.211516 2574 scope.go:117] "RemoveContainer" containerID="c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466"
Apr 21 04:36:13.211789 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:36:13.211771 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466\": container with ID starting with c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466 not found: ID does not exist" containerID="c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466"
Apr 21 04:36:13.211837 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.211799 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466"} err="failed to get container status \"c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466\": rpc error: code = NotFound desc = could not find container \"c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466\": container with ID starting with c80d95174dd23accc51add34518db4c08555a5e10871e9b1f8d9e1a748298466 not found: ID does not exist"
Apr 21 04:36:13.223241 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.223209 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f6f645c74-x5m9b"]
Apr 21 04:36:13.229086 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.229063 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f6f645c74-x5m9b"]
Apr 21 04:36:13.687982 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:13.687950 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531f8c1b-7669-482b-997a-314d224aed15" path="/var/lib/kubelet/pods/531f8c1b-7669-482b-997a-314d224aed15/volumes"
Apr 21 04:36:16.093196 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:16.093158 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:36:16.093683 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:16.093449 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak" containerID="cri-o://6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244" gracePeriod=30
Apr 21 04:36:18.126497 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.126474 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.225829 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.225799 2574 generic.go:358] "Generic (PLEG): container finished" podID="f32d77a6-439c-4112-8b10-14a28180dd82" containerID="6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244" exitCode=143
Apr 21 04:36:18.225997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.225855 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.225997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.225885 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"f32d77a6-439c-4112-8b10-14a28180dd82","Type":"ContainerDied","Data":"6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244"}
Apr 21 04:36:18.225997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.225925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"f32d77a6-439c-4112-8b10-14a28180dd82","Type":"ContainerDied","Data":"9dca34a5f629c12989dc6328be716ce311223a7a22576339310679825de2a243"}
Apr 21 04:36:18.225997 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.225943 2574 scope.go:117] "RemoveContainer" containerID="6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244"
Apr 21 04:36:18.234967 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.234936 2574 scope.go:117] "RemoveContainer" containerID="6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244"
Apr 21 04:36:18.235212 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:36:18.235191 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244\": container with ID starting with 6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244 not found: ID does not exist" containerID="6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244"
Apr 21 04:36:18.235289 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.235219 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244"} err="failed to get container status \"6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244\": rpc error: code = NotFound desc = could not find container \"6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244\": container with ID starting with 6d0e5fd673454f3a05263361ca6f05ad50c290ddb2be40e4ff6f8cd7c484a244 not found: ID does not exist"
Apr 21 04:36:18.241480 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.241451 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-456gs\" (UniqueName: \"kubernetes.io/projected/f32d77a6-439c-4112-8b10-14a28180dd82-kube-api-access-456gs\") pod \"f32d77a6-439c-4112-8b10-14a28180dd82\" (UID: \"f32d77a6-439c-4112-8b10-14a28180dd82\") "
Apr 21 04:36:18.243453 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.243426 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32d77a6-439c-4112-8b10-14a28180dd82-kube-api-access-456gs" (OuterVolumeSpecName: "kube-api-access-456gs") pod "f32d77a6-439c-4112-8b10-14a28180dd82" (UID: "f32d77a6-439c-4112-8b10-14a28180dd82"). InnerVolumeSpecName "kube-api-access-456gs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:36:18.342777 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.342707 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-456gs\" (UniqueName: \"kubernetes.io/projected/f32d77a6-439c-4112-8b10-14a28180dd82-kube-api-access-456gs\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\""
Apr 21 04:36:18.547452 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.547424 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:36:18.550213 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.550191 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.572876 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.573522 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531f8c1b-7669-482b-997a-314d224aed15" containerName="manager"
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.573540 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="531f8c1b-7669-482b-997a-314d224aed15" containerName="manager"
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.573570 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak"
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.573578 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak"
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.573738 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" containerName="keycloak"
Apr 21 04:36:18.575031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.573756 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="531f8c1b-7669-482b-997a-314d224aed15" containerName="manager"
Apr 21 04:36:18.579274 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.579245 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.582187 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.582045 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\""
Apr 21 04:36:18.582187 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.582063 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-p2s9m\""
Apr 21 04:36:18.582187 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.582087 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 21 04:36:18.582471 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.582392 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 21 04:36:18.582586 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.582566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 21 04:36:18.583176 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.583154 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:36:18.645650 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.645616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/9b8ad1b1-32dd-4a93-ae5c-0609853f2893-test-realms\") pod \"maas-keycloak-0\" (UID: \"9b8ad1b1-32dd-4a93-ae5c-0609853f2893\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.645809 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.645699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtdhn\" (UniqueName: \"kubernetes.io/projected/9b8ad1b1-32dd-4a93-ae5c-0609853f2893-kube-api-access-xtdhn\") pod \"maas-keycloak-0\" (UID: \"9b8ad1b1-32dd-4a93-ae5c-0609853f2893\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.746744 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.746681 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtdhn\" (UniqueName: \"kubernetes.io/projected/9b8ad1b1-32dd-4a93-ae5c-0609853f2893-kube-api-access-xtdhn\") pod \"maas-keycloak-0\" (UID: \"9b8ad1b1-32dd-4a93-ae5c-0609853f2893\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.746898 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.746780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/9b8ad1b1-32dd-4a93-ae5c-0609853f2893-test-realms\") pod \"maas-keycloak-0\" (UID: \"9b8ad1b1-32dd-4a93-ae5c-0609853f2893\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.747491 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.747471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/9b8ad1b1-32dd-4a93-ae5c-0609853f2893-test-realms\") pod \"maas-keycloak-0\" (UID: \"9b8ad1b1-32dd-4a93-ae5c-0609853f2893\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.755283 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.755251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtdhn\" (UniqueName: \"kubernetes.io/projected/9b8ad1b1-32dd-4a93-ae5c-0609853f2893-kube-api-access-xtdhn\") pod \"maas-keycloak-0\" (UID: \"9b8ad1b1-32dd-4a93-ae5c-0609853f2893\") " pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:18.890054 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:18.889972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:36:19.013321 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:19.013295 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:36:19.014845 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:36:19.014816 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8ad1b1_32dd_4a93_ae5c_0609853f2893.slice/crio-c16b8bac98310d44f6917c8ad0d8a11df5a12352700818f198d3098e54dc05fe WatchSource:0}: Error finding container c16b8bac98310d44f6917c8ad0d8a11df5a12352700818f198d3098e54dc05fe: Status 404 returned error can't find the container with id c16b8bac98310d44f6917c8ad0d8a11df5a12352700818f198d3098e54dc05fe
Apr 21 04:36:19.231031 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:19.230940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"9b8ad1b1-32dd-4a93-ae5c-0609853f2893","Type":"ContainerStarted","Data":"c16b8bac98310d44f6917c8ad0d8a11df5a12352700818f198d3098e54dc05fe"}
Apr 21 04:36:19.688404 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:19.688361 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32d77a6-439c-4112-8b10-14a28180dd82" path="/var/lib/kubelet/pods/f32d77a6-439c-4112-8b10-14a28180dd82/volumes"
Apr 21 04:36:20.237807 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:20.237765 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"9b8ad1b1-32dd-4a93-ae5c-0609853f2893","Type":"ContainerStarted","Data":"ea852f802071f97a650d86b650f80c5171f99e4d7f8f38bcb4f9155f07f794dc"}
Apr 21 04:36:20.254751 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:20.254660 2574
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.9458059890000001 podStartE2EDuration="2.254638572s" podCreationTimestamp="2026-04-21 04:36:18 +0000 UTC" firstStartedPulling="2026-04-21 04:36:19.016172093 +0000 UTC m=+757.876276050" lastFinishedPulling="2026-04-21 04:36:19.325004658 +0000 UTC m=+758.185108633" observedRunningTime="2026-04-21 04:36:20.25367349 +0000 UTC m=+759.113777470" watchObservedRunningTime="2026-04-21 04:36:20.254638572 +0000 UTC m=+759.114742552" Apr 21 04:36:20.890357 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:20.890312 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 04:36:20.892110 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:20.892072 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:21.891132 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:21.891066 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:22.891037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:22.890977 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:23.891037 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:23.890985 2574 prober.go:120] "Probe 
failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:24.890599 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:24.890544 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:25.891210 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:25.891154 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:26.890953 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:26.890906 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:27.891077 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:27.891028 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:28.890324 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:28.890285 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 
04:36:28.890704 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:28.890668 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:29.890572 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:29.890521 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:30.891318 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:30.891263 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:31.890366 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:31.890319 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused" Apr 21 04:36:33.017710 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:33.017677 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 04:36:33.036917 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:33.036866 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="9b8ad1b1-32dd-4a93-ae5c-0609853f2893" containerName="keycloak" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 21 04:36:43.023775 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:43.023706 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 04:36:54.880134 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:54.880061 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85599cd679-w2cl5"] Apr 21 04:36:54.881102 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:54.881044 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-85599cd679-w2cl5" podUID="afb9faad-14c7-4f6b-be77-109601000770" containerName="authorino" containerID="cri-o://d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f" gracePeriod=30 Apr 21 04:36:55.150699 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.150674 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:36:55.282030 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.281994 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/afb9faad-14c7-4f6b-be77-109601000770-tls-cert\") pod \"afb9faad-14c7-4f6b-be77-109601000770\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " Apr 21 04:36:55.282030 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.282038 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4sk4\" (UniqueName: \"kubernetes.io/projected/afb9faad-14c7-4f6b-be77-109601000770-kube-api-access-d4sk4\") pod \"afb9faad-14c7-4f6b-be77-109601000770\" (UID: \"afb9faad-14c7-4f6b-be77-109601000770\") " Apr 21 04:36:55.284028 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.284003 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb9faad-14c7-4f6b-be77-109601000770-kube-api-access-d4sk4" 
(OuterVolumeSpecName: "kube-api-access-d4sk4") pod "afb9faad-14c7-4f6b-be77-109601000770" (UID: "afb9faad-14c7-4f6b-be77-109601000770"). InnerVolumeSpecName "kube-api-access-d4sk4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:36:55.292045 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.292007 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb9faad-14c7-4f6b-be77-109601000770-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "afb9faad-14c7-4f6b-be77-109601000770" (UID: "afb9faad-14c7-4f6b-be77-109601000770"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:36:55.383585 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.383551 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/afb9faad-14c7-4f6b-be77-109601000770-tls-cert\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:36:55.383585 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.383583 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4sk4\" (UniqueName: \"kubernetes.io/projected/afb9faad-14c7-4f6b-be77-109601000770-kube-api-access-d4sk4\") on node \"ip-10-0-140-163.ec2.internal\" DevicePath \"\"" Apr 21 04:36:55.393395 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.393324 2574 generic.go:358] "Generic (PLEG): container finished" podID="afb9faad-14c7-4f6b-be77-109601000770" containerID="d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f" exitCode=0 Apr 21 04:36:55.393395 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.393377 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85599cd679-w2cl5" Apr 21 04:36:55.393543 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.393411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85599cd679-w2cl5" event={"ID":"afb9faad-14c7-4f6b-be77-109601000770","Type":"ContainerDied","Data":"d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f"} Apr 21 04:36:55.393543 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.393449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85599cd679-w2cl5" event={"ID":"afb9faad-14c7-4f6b-be77-109601000770","Type":"ContainerDied","Data":"1c74d6d8d472f80ac8c6e5ad488a2568a3c43cca4402f8ac3f93b18309c434bc"} Apr 21 04:36:55.393543 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.393465 2574 scope.go:117] "RemoveContainer" containerID="d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f" Apr 21 04:36:55.402487 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.402471 2574 scope.go:117] "RemoveContainer" containerID="d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f" Apr 21 04:36:55.402750 ip-10-0-140-163 kubenswrapper[2574]: E0421 04:36:55.402710 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f\": container with ID starting with d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f not found: ID does not exist" containerID="d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f" Apr 21 04:36:55.402818 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.402758 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f"} err="failed to get container status \"d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f\": rpc error: code = 
NotFound desc = could not find container \"d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f\": container with ID starting with d8fd040005525e4d02b820e982ad74c02124c70b11c620dd8adf9d07cbfafb8f not found: ID does not exist" Apr 21 04:36:55.414591 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.414566 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85599cd679-w2cl5"] Apr 21 04:36:55.418656 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.418636 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-85599cd679-w2cl5"] Apr 21 04:36:55.687024 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:36:55.686948 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb9faad-14c7-4f6b-be77-109601000770" path="/var/lib/kubelet/pods/afb9faad-14c7-4f6b-be77-109601000770/volumes" Apr 21 04:37:08.428562 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.428483 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9"] Apr 21 04:37:08.428946 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.428831 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afb9faad-14c7-4f6b-be77-109601000770" containerName="authorino" Apr 21 04:37:08.428946 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.428844 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb9faad-14c7-4f6b-be77-109601000770" containerName="authorino" Apr 21 04:37:08.428946 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.428897 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="afb9faad-14c7-4f6b-be77-109601000770" containerName="authorino" Apr 21 04:37:08.432262 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.432247 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.434513 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.434493 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 04:37:08.435328 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.435310 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-569jl\"" Apr 21 04:37:08.435428 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.435348 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 04:37:08.435525 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.435505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 04:37:08.441061 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.441036 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9"] Apr 21 04:37:08.590627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.590595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.590627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.590630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" 
Apr 21 04:37:08.590888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.590655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.590888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.590751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.590888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.590787 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7cr\" (UniqueName: \"kubernetes.io/projected/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-kube-api-access-wc7cr\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.590888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.590839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692076 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.691976 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692076 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: 
\"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692276 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7cr\" (UniqueName: \"kubernetes.io/projected/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-kube-api-access-wc7cr\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692557 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692557 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.692682 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.692601 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.694259 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.694237 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.694584 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.694566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.700006 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.699984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7cr\" (UniqueName: \"kubernetes.io/projected/42c64b83-9e5f-4988-ab83-c5e75f0ac8d6-kube-api-access-wc7cr\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9\" (UID: \"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.743769 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.743704 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:08.870536 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:08.870501 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9"] Apr 21 04:37:08.872807 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:37:08.872778 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c64b83_9e5f_4988_ab83_c5e75f0ac8d6.slice/crio-ea6ef9e711318b4d19d7fdfaffb4e04a689d094e2efc852646cd40ada9cdeaba WatchSource:0}: Error finding container ea6ef9e711318b4d19d7fdfaffb4e04a689d094e2efc852646cd40ada9cdeaba: Status 404 returned error can't find the container with id ea6ef9e711318b4d19d7fdfaffb4e04a689d094e2efc852646cd40ada9cdeaba Apr 21 04:37:09.445717 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:09.445679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" event={"ID":"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6","Type":"ContainerStarted","Data":"ea6ef9e711318b4d19d7fdfaffb4e04a689d094e2efc852646cd40ada9cdeaba"} Apr 21 04:37:11.925845 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:11.925805 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks"] Apr 21 04:37:11.929380 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:11.929364 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:11.931547 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:11.931529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 21 04:37:11.937184 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:11.936916 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks"] Apr 21 04:37:12.018600 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.018568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.018800 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.018614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqkjz\" (UniqueName: \"kubernetes.io/projected/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-kube-api-access-dqkjz\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.018800 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.018635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.018800 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.018747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.018800 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.018778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.018959 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.018813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.119797 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.119762 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqkjz\" (UniqueName: \"kubernetes.io/projected/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-kube-api-access-dqkjz\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 
04:37:12.119981 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.119806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.119981 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.119869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.119981 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.119893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.119981 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.119930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.120197 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.119994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.120463 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.120437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.120551 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.120470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.120551 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.120507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.122465 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.122440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.122716 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.122698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.127546 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.127519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqkjz\" (UniqueName: \"kubernetes.io/projected/e30cbfd7-8690-4038-9253-ad3a3dc1b9b6-kube-api-access-dqkjz\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks\" (UID: \"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.241579 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.241485 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:12.369389 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.369360 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks"] Apr 21 04:37:12.371072 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:37:12.371046 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode30cbfd7_8690_4038_9253_ad3a3dc1b9b6.slice/crio-b0cbab4476b8936672b3f86d62f2b4d85db0de271ccc7cc3ad77ea1875305e08 WatchSource:0}: Error finding container b0cbab4476b8936672b3f86d62f2b4d85db0de271ccc7cc3ad77ea1875305e08: Status 404 returned error can't find the container with id b0cbab4476b8936672b3f86d62f2b4d85db0de271ccc7cc3ad77ea1875305e08 Apr 21 04:37:12.457315 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:12.457272 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" event={"ID":"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6","Type":"ContainerStarted","Data":"b0cbab4476b8936672b3f86d62f2b4d85db0de271ccc7cc3ad77ea1875305e08"} Apr 21 04:37:15.470181 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:15.470126 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" event={"ID":"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6","Type":"ContainerStarted","Data":"abee63d45a0fa76e781b20bf87a383766ebb21c6482a2642728ddd2356b5edcb"} Apr 21 04:37:15.471627 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:15.471597 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" event={"ID":"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6","Type":"ContainerStarted","Data":"15f99ad048715f1c4930f92cf1015edf099e6027b3b0e34ef017c515201ee60d"} Apr 21 04:37:19.614911 ip-10-0-140-163 
kubenswrapper[2574]: I0421 04:37:19.614868 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8"] Apr 21 04:37:19.618495 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.618474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.620811 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.620788 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 04:37:19.628884 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.628861 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8"] Apr 21 04:37:19.793905 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.793870 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7blrq\" (UniqueName: \"kubernetes.io/projected/d9faf4bf-e661-4697-a01f-96e8a0523bb8-kube-api-access-7blrq\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.793905 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.793909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.794120 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.793940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9faf4bf-e661-4697-a01f-96e8a0523bb8-tls-certs\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.794120 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.794025 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.794185 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.794139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.794185 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.794179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.894994 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.894901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7blrq\" (UniqueName: \"kubernetes.io/projected/d9faf4bf-e661-4697-a01f-96e8a0523bb8-kube-api-access-7blrq\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.894994 
ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.894964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9faf4bf-e661-4697-a01f-96e8a0523bb8-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895225 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895385 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-dshm\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895385 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895353 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895568 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.895609 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.895588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.897566 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.897544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9faf4bf-e661-4697-a01f-96e8a0523bb8-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.897978 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.897962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9faf4bf-e661-4697-a01f-96e8a0523bb8-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.902404 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.902384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7blrq\" (UniqueName: \"kubernetes.io/projected/d9faf4bf-e661-4697-a01f-96e8a0523bb8-kube-api-access-7blrq\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8\" (UID: \"d9faf4bf-e661-4697-a01f-96e8a0523bb8\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:19.929349 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:19.929317 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:20.071711 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:20.071680 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8"] Apr 21 04:37:20.495487 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:20.495424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" event={"ID":"d9faf4bf-e661-4697-a01f-96e8a0523bb8","Type":"ContainerStarted","Data":"ae375d04c0b860091ceb966f3e12882223e040416e2150c7f45a4780d536b681"} Apr 21 04:37:20.495487 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:20.495480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" event={"ID":"d9faf4bf-e661-4697-a01f-96e8a0523bb8","Type":"ContainerStarted","Data":"1a0e2931d92e6656b158bb74f2049f83e52aa04bd6c2f0d5b488dd34a87764e4"} Apr 21 04:37:21.501288 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:21.501250 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="e30cbfd7-8690-4038-9253-ad3a3dc1b9b6" containerID="15f99ad048715f1c4930f92cf1015edf099e6027b3b0e34ef017c515201ee60d" exitCode=0 Apr 21 04:37:21.501695 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:21.501322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" event={"ID":"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6","Type":"ContainerDied","Data":"15f99ad048715f1c4930f92cf1015edf099e6027b3b0e34ef017c515201ee60d"} Apr 21 04:37:21.502920 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:21.502896 2574 generic.go:358] "Generic (PLEG): container finished" podID="42c64b83-9e5f-4988-ab83-c5e75f0ac8d6" containerID="abee63d45a0fa76e781b20bf87a383766ebb21c6482a2642728ddd2356b5edcb" exitCode=0 Apr 21 04:37:21.502995 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:21.502965 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" event={"ID":"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6","Type":"ContainerDied","Data":"abee63d45a0fa76e781b20bf87a383766ebb21c6482a2642728ddd2356b5edcb"} Apr 21 04:37:23.513697 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:23.513654 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" event={"ID":"42c64b83-9e5f-4988-ab83-c5e75f0ac8d6","Type":"ContainerStarted","Data":"007c780f9994b635dae1756141a0ed40dd6c19665904fa9125bc3c8c2e0d0dba"} Apr 21 04:37:23.514141 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:23.513941 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:23.515287 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:23.515265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" 
event={"ID":"e30cbfd7-8690-4038-9253-ad3a3dc1b9b6","Type":"ContainerStarted","Data":"c1138fee3c555e7270563a9827d96ab8c9c8eb9d50e9c70d163d37ade1c142b8"} Apr 21 04:37:23.515469 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:23.515449 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:23.531639 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:23.531594 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" podStartSLOduration=1.823906472 podStartE2EDuration="15.531582058s" podCreationTimestamp="2026-04-21 04:37:08 +0000 UTC" firstStartedPulling="2026-04-21 04:37:08.875110867 +0000 UTC m=+807.735214827" lastFinishedPulling="2026-04-21 04:37:22.582786453 +0000 UTC m=+821.442890413" observedRunningTime="2026-04-21 04:37:23.529657997 +0000 UTC m=+822.389762000" watchObservedRunningTime="2026-04-21 04:37:23.531582058 +0000 UTC m=+822.391686036" Apr 21 04:37:23.547030 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:23.546987 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" podStartSLOduration=2.332549978 podStartE2EDuration="12.546975832s" podCreationTimestamp="2026-04-21 04:37:11 +0000 UTC" firstStartedPulling="2026-04-21 04:37:12.373573264 +0000 UTC m=+811.233677222" lastFinishedPulling="2026-04-21 04:37:22.587999112 +0000 UTC m=+821.448103076" observedRunningTime="2026-04-21 04:37:23.546068495 +0000 UTC m=+822.406172477" watchObservedRunningTime="2026-04-21 04:37:23.546975832 +0000 UTC m=+822.407079810" Apr 21 04:37:26.530046 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:26.530012 2574 generic.go:358] "Generic (PLEG): container finished" podID="d9faf4bf-e661-4697-a01f-96e8a0523bb8" containerID="ae375d04c0b860091ceb966f3e12882223e040416e2150c7f45a4780d536b681" exitCode=0 Apr 21 
04:37:26.530513 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:26.530086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" event={"ID":"d9faf4bf-e661-4697-a01f-96e8a0523bb8","Type":"ContainerDied","Data":"ae375d04c0b860091ceb966f3e12882223e040416e2150c7f45a4780d536b681"} Apr 21 04:37:27.535623 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:27.535590 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" event={"ID":"d9faf4bf-e661-4697-a01f-96e8a0523bb8","Type":"ContainerStarted","Data":"b170f2686be3cfabbedcea27fe0732dafd82a366b0c33fe69f98a69279886b1e"} Apr 21 04:37:27.536143 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:27.535848 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:37:27.554288 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:27.554235 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" podStartSLOduration=8.363431908999999 podStartE2EDuration="8.554220752s" podCreationTimestamp="2026-04-21 04:37:19 +0000 UTC" firstStartedPulling="2026-04-21 04:37:26.53071057 +0000 UTC m=+825.390814527" lastFinishedPulling="2026-04-21 04:37:26.721499395 +0000 UTC m=+825.581603370" observedRunningTime="2026-04-21 04:37:27.551589217 +0000 UTC m=+826.411693195" watchObservedRunningTime="2026-04-21 04:37:27.554220752 +0000 UTC m=+826.414324738" Apr 21 04:37:34.534363 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:34.534329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks" Apr 21 04:37:34.535064 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:34.535043 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9" Apr 21 04:37:38.552387 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:37:38.552354 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8" Apr 21 04:38:41.675880 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:38:41.675799 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log" Apr 21 04:38:41.676384 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:38:41.676194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log" Apr 21 04:39:16.891656 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:16.891627 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5dc44bd94b-cct56_91342866-b823-44bb-b2fe-98e443f14fc2/manager/0.log" Apr 21 04:39:17.331313 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:17.331284 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5b6f69cdb8-wsldk_bac42821-e633-4da9-bb86-c0f282c8a751/manager/0.log" Apr 21 04:39:17.441702 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:17.441669 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-ccgk7_0fe159dc-e5ab-451b-a362-091d22df0bb9/postgres/0.log" Apr 21 04:39:18.193538 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.193503 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf_be30dc22-b0e8-468c-92aa-a37020853240/util/0.log" Apr 21 04:39:18.199543 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.199518 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf_be30dc22-b0e8-468c-92aa-a37020853240/pull/0.log" Apr 21 04:39:18.205261 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.205238 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf_be30dc22-b0e8-468c-92aa-a37020853240/extract/0.log" Apr 21 04:39:18.313965 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.313934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4_a6ec1d15-a3d0-4783-811b-3f22bdfba652/util/0.log" Apr 21 04:39:18.320192 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.320157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4_a6ec1d15-a3d0-4783-811b-3f22bdfba652/pull/0.log" Apr 21 04:39:18.326231 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.326200 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4_a6ec1d15-a3d0-4783-811b-3f22bdfba652/extract/0.log" Apr 21 04:39:18.430949 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.430922 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs_73b4592f-9384-4b6f-a122-ad4317c5ffb0/util/0.log" Apr 21 04:39:18.437822 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.437798 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs_73b4592f-9384-4b6f-a122-ad4317c5ffb0/pull/0.log" Apr 21 04:39:18.443950 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.443856 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs_73b4592f-9384-4b6f-a122-ad4317c5ffb0/extract/0.log" Apr 21 04:39:18.545029 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.544983 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml_28f6b4ba-638e-46c3-9357-9840f2c1faee/pull/0.log" Apr 21 04:39:18.550300 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.550277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml_28f6b4ba-638e-46c3-9357-9840f2c1faee/extract/0.log" Apr 21 04:39:18.555604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.555585 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml_28f6b4ba-638e-46c3-9357-9840f2c1faee/util/0.log" Apr 21 04:39:18.882770 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:18.882740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-7d4tm_62de46c9-dd26-484d-a95a-9a697df9677c/manager/0.log" Apr 21 04:39:19.100608 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:19.100578 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wpbq6_48a4cb8f-9ced-461e-a1fa-779f856430a9/registry-server/0.log" Apr 21 04:39:19.430755 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:19.430714 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-wlnws_0eb69111-0804-4827-97f5-289d251f0d78/manager/0.log" Apr 21 04:39:19.776367 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:19.776276 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6_cbe8fefc-8e21-492e-8023-06c30c0e6259/istio-proxy/0.log"
Apr 21 04:39:20.209050 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:20.209007 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-78vcq_00e10d29-e52c-4844-ba3d-64f2e59fd441/istio-proxy/0.log"
Apr 21 04:39:20.778901 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:20.778870 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8_d9faf4bf-e661-4697-a01f-96e8a0523bb8/storage-initializer/0.log"
Apr 21 04:39:20.786472 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:20.786442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-4lvb8_d9faf4bf-e661-4697-a01f-96e8a0523bb8/main/0.log"
Apr 21 04:39:21.017313 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:21.017282 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks_e30cbfd7-8690-4038-9253-ad3a3dc1b9b6/storage-initializer/0.log"
Apr 21 04:39:21.025004 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:21.024962 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcctv2ks_e30cbfd7-8690-4038-9253-ad3a3dc1b9b6/main/0.log"
Apr 21 04:39:21.138803 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:21.138772 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9_42c64b83-9e5f-4988-ab83-c5e75f0ac8d6/main/0.log"
Apr 21 04:39:21.145328 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:21.145301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-jbrq9_42c64b83-9e5f-4988-ab83-c5e75f0ac8d6/storage-initializer/0.log"
Apr 21 04:39:27.527364 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:27.527333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-csr9d_ce0750e5-90f6-418d-9510-f807b89c746c/global-pull-secret-syncer/0.log"
Apr 21 04:39:27.606989 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:27.606962 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4zx8l_1042efe6-4e47-4044-89ad-3285ece081ef/konnectivity-agent/0.log"
Apr 21 04:39:27.726888 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:27.726861 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-163.ec2.internal_3f4b2ba79c5ad3c70dc6c2f14b4ac240/haproxy/0.log"
Apr 21 04:39:31.689161 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.689128 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf_be30dc22-b0e8-468c-92aa-a37020853240/extract/0.log"
Apr 21 04:39:31.711173 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.711143 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf_be30dc22-b0e8-468c-92aa-a37020853240/util/0.log"
Apr 21 04:39:31.733707 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.733682 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759pszzf_be30dc22-b0e8-468c-92aa-a37020853240/pull/0.log"
Apr 21 04:39:31.762491 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.762462 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4_a6ec1d15-a3d0-4783-811b-3f22bdfba652/extract/0.log"
Apr 21 04:39:31.797129 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.797103 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4_a6ec1d15-a3d0-4783-811b-3f22bdfba652/util/0.log"
Apr 21 04:39:31.827546 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.827517 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0xkbm4_a6ec1d15-a3d0-4783-811b-3f22bdfba652/pull/0.log"
Apr 21 04:39:31.857871 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.857845 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs_73b4592f-9384-4b6f-a122-ad4317c5ffb0/extract/0.log"
Apr 21 04:39:31.876848 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.876825 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs_73b4592f-9384-4b6f-a122-ad4317c5ffb0/util/0.log"
Apr 21 04:39:31.895719 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.895699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73mvfzs_73b4592f-9384-4b6f-a122-ad4317c5ffb0/pull/0.log"
Apr 21 04:39:31.917285 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.917259 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml_28f6b4ba-638e-46c3-9357-9840f2c1faee/extract/0.log"
Apr 21 04:39:31.936832 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.936811 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml_28f6b4ba-638e-46c3-9357-9840f2c1faee/util/0.log"
Apr 21 04:39:31.964205 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:31.964138 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1cwjml_28f6b4ba-638e-46c3-9357-9840f2c1faee/pull/0.log"
Apr 21 04:39:32.046422 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:32.046392 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-7d4tm_62de46c9-dd26-484d-a95a-9a697df9677c/manager/0.log"
Apr 21 04:39:32.117488 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:32.117453 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wpbq6_48a4cb8f-9ced-461e-a1fa-779f856430a9/registry-server/0.log"
Apr 21 04:39:32.212065 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:32.212034 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-wlnws_0eb69111-0804-4827-97f5-289d251f0d78/manager/0.log"
Apr 21 04:39:34.072883 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:34.072853 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pl7sm_165e36f0-6193-4c39-a29d-a566ebef068d/node-exporter/0.log"
Apr 21 04:39:34.091084 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:34.091058 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pl7sm_165e36f0-6193-4c39-a29d-a566ebef068d/kube-rbac-proxy/0.log"
Apr 21 04:39:34.108628 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:34.108606 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pl7sm_165e36f0-6193-4c39-a29d-a566ebef068d/init-textfile/0.log"
Apr 21 04:39:35.673814 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:35.673778 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-985mb_668d3b91-6679-4a3c-af26-520078ad97bd/networking-console-plugin/0.log"
Apr 21 04:39:36.178902 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.178872 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"]
Apr 21 04:39:36.182170 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.182149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.184477 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.184455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pv5rw\"/\"kube-root-ca.crt\""
Apr 21 04:39:36.184599 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.184539 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pv5rw\"/\"openshift-service-ca.crt\""
Apr 21 04:39:36.185533 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.185507 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pv5rw\"/\"default-dockercfg-zc9pn\""
Apr 21 04:39:36.189060 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.189027 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"]
Apr 21 04:39:36.262298 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.262256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6cxd\" (UniqueName: \"kubernetes.io/projected/707d080d-f09d-4ac2-bc2a-f3b605e4d935-kube-api-access-t6cxd\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.262478 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.262311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-sys\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.262478 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.262335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-lib-modules\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.262478 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.262402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-proc\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.262604 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.262478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-podres\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363157 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6cxd\" (UniqueName: \"kubernetes.io/projected/707d080d-f09d-4ac2-bc2a-f3b605e4d935-kube-api-access-t6cxd\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363157 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-sys\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-lib-modules\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-proc\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-podres\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-sys\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-proc\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-podres\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.363412 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.363374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/707d080d-f09d-4ac2-bc2a-f3b605e4d935-lib-modules\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.371592 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.371568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6cxd\" (UniqueName: \"kubernetes.io/projected/707d080d-f09d-4ac2-bc2a-f3b605e4d935-kube-api-access-t6cxd\") pod \"perf-node-gather-daemonset-84t2q\" (UID: \"707d080d-f09d-4ac2-bc2a-f3b605e4d935\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.493405 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.493321 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:36.617055 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:36.617023 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"]
Apr 21 04:39:36.618556 ip-10-0-140-163 kubenswrapper[2574]: W0421 04:39:36.618524 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod707d080d_f09d_4ac2_bc2a_f3b605e4d935.slice/crio-515853f4dabceb75ca46aea350ecce12e99f1916ee8cf328b2cadb85151c1271 WatchSource:0}: Error finding container 515853f4dabceb75ca46aea350ecce12e99f1916ee8cf328b2cadb85151c1271: Status 404 returned error can't find the container with id 515853f4dabceb75ca46aea350ecce12e99f1916ee8cf328b2cadb85151c1271
Apr 21 04:39:37.012988 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:37.012944 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q" event={"ID":"707d080d-f09d-4ac2-bc2a-f3b605e4d935","Type":"ContainerStarted","Data":"5ff970669b4857c782d55c9ae9166b50a86ae88a5237b27720d2b9402eff1a94"}
Apr 21 04:39:37.012988 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:37.012983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q" event={"ID":"707d080d-f09d-4ac2-bc2a-f3b605e4d935","Type":"ContainerStarted","Data":"515853f4dabceb75ca46aea350ecce12e99f1916ee8cf328b2cadb85151c1271"}
Apr 21 04:39:37.013497 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:37.013019 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:37.030244 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:37.030194 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q" podStartSLOduration=1.030180668 podStartE2EDuration="1.030180668s" podCreationTimestamp="2026-04-21 04:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:37.027187923 +0000 UTC m=+955.887291903" watchObservedRunningTime="2026-04-21 04:39:37.030180668 +0000 UTC m=+955.890284647"
Apr 21 04:39:38.146768 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:38.146735 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tz8h8_9891f7eb-3b3e-4b55-8ce3-97281e95db7e/dns/0.log"
Apr 21 04:39:38.164212 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:38.164185 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tz8h8_9891f7eb-3b3e-4b55-8ce3-97281e95db7e/kube-rbac-proxy/0.log"
Apr 21 04:39:38.183335 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:38.183308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7b454_9d7bcf82-4223-4a89-a957-2d50a0af065e/dns-node-resolver/0.log"
Apr 21 04:39:38.850417 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:38.850391 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pqxgt_59fe2a00-5ff4-4e40-b04e-88c5b7b9b743/node-ca/0.log"
Apr 21 04:39:39.647654 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:39.647624 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfxthw6_cbe8fefc-8e21-492e-8023-06c30c0e6259/istio-proxy/0.log"
Apr 21 04:39:39.770375 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:39.770326 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-78vcq_00e10d29-e52c-4844-ba3d-64f2e59fd441/istio-proxy/0.log"
Apr 21 04:39:40.275335 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:40.275303 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rbw56_65dc137e-873c-4d24-b876-d35819405cac/serve-healthcheck-canary/0.log"
Apr 21 04:39:40.799085 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:40.799053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-crssf_81d1ae8a-86ba-4852-83da-1e18ebe907b1/kube-rbac-proxy/0.log"
Apr 21 04:39:40.818425 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:40.818400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-crssf_81d1ae8a-86ba-4852-83da-1e18ebe907b1/exporter/0.log"
Apr 21 04:39:40.838846 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:40.838807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-crssf_81d1ae8a-86ba-4852-83da-1e18ebe907b1/extractor/0.log"
Apr 21 04:39:42.792207 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:42.792171 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5dc44bd94b-cct56_91342866-b823-44bb-b2fe-98e443f14fc2/manager/0.log"
Apr 21 04:39:42.905814 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:42.905780 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5b6f69cdb8-wsldk_bac42821-e633-4da9-bb86-c0f282c8a751/manager/0.log"
Apr 21 04:39:42.926388 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:42.926357 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-ccgk7_0fe159dc-e5ab-451b-a362-091d22df0bb9/postgres/0.log"
Apr 21 04:39:43.027116 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:43.027089 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-84t2q"
Apr 21 04:39:44.128581 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:44.128547 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-579f6d4cb9-jbdjh_8142b997-841e-4633-ac0f-3a0a600c0bca/manager/0.log"
Apr 21 04:39:49.815831 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:49.815795 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-77fkm_a70a5ac4-3cb0-4b1d-b4c0-243990f39d76/kube-multus/0.log"
Apr 21 04:39:50.181452 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.181418 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/kube-multus-additional-cni-plugins/0.log"
Apr 21 04:39:50.223304 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.223277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/egress-router-binary-copy/0.log"
Apr 21 04:39:50.259375 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.259344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/cni-plugins/0.log"
Apr 21 04:39:50.289824 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.289792 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/bond-cni-plugin/0.log"
Apr 21 04:39:50.320428 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.320400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/routeoverride-cni/0.log"
Apr 21 04:39:50.356230 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.356202 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/whereabouts-cni-bincopy/0.log"
Apr 21 04:39:50.388467 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.388441 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-85nl4_2b3ed1e6-f33c-4bff-99f0-e47538021110/whereabouts-cni/0.log"
Apr 21 04:39:50.600217 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.600189 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bwjm7_c904be64-c3a8-4c8b-97c6-75f15dbf9670/network-metrics-daemon/0.log"
Apr 21 04:39:50.616931 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:50.616900 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bwjm7_c904be64-c3a8-4c8b-97c6-75f15dbf9670/kube-rbac-proxy/0.log"
Apr 21 04:39:51.700854 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.700820 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-controller/0.log"
Apr 21 04:39:51.717638 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.717611 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/0.log"
Apr 21 04:39:51.727116 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.727092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovn-acl-logging/1.log"
Apr 21 04:39:51.749338 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.749310 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/kube-rbac-proxy-node/0.log"
Apr 21 04:39:51.772074 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.772035 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 04:39:51.791160 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.791135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/northd/0.log"
Apr 21 04:39:51.810632 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.810610 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/nbdb/0.log"
Apr 21 04:39:51.833455 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:51.833432 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/sbdb/0.log"
Apr 21 04:39:52.003385 ip-10-0-140-163 kubenswrapper[2574]: I0421 04:39:52.003307 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-szzqq_c6af1b08-59ae-47ba-9588-6c079464ff78/ovnkube-controller/0.log"