Apr 22 16:21:40.178328 ip-10-0-142-238 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 16:21:40.632442 ip-10-0-142-238 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:40.632442 ip-10-0-142-238 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 16:21:40.632442 ip-10-0-142-238 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:40.632442 ip-10-0-142-238 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 16:21:40.632442 ip-10-0-142-238 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:40.635085 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.634987 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 16:21:40.638337 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638321 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:40.638337 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638338 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638341 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638344 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638347 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638351 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638353 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638358 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638361 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638363 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638366 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638368 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638371 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638375 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638378 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638381 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638383 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638386 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638388 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638390 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638393 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:40.638399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638395 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638398 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638400 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638403 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638406 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638409 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638412 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638414 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638417 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638419 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638422 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638426 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638430 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638433 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638435 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638438 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638442 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638444 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638446 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:40.638875 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638450 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638453 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638456 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638459 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638461 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638464 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638467 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638470 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638472 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638475 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638477 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638479 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638482 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638484 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638486 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638490 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638492 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638495 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638497 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638500 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:40.639407 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638503 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638505 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638509 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638511 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638514 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638516 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638519 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638521 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638524 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638526 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638529 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638531 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638533 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638536 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638538 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638541 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638543 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638546 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638548 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:40.639876 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638551 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638553 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638556 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638558 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638560 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638563 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638565 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638928 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638933 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638937 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638939 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638942 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638945 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638947 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638949 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638952 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638955 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638957 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638959 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638962 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:40.640331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638964 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638966 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638969 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638973 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638976 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638979 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638981 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638984 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638986 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638989 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638991 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638994 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.638997 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639000 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639002 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639004 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639007 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639010 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639012 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639014 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:40.640803 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639017 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639020 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639022 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639025 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639027 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639029 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639032 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639034 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639051 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639054 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639057 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639059 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639062 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639064 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639067 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639069 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639072 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639074 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639077 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639079 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:40.641291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639081 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639084 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639086 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639088 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639092 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639099 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639102 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639104 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639107 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639109 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639113 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639115 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639118 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639121 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639124 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639148 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639152 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639155 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639158 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:40.641785 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639161 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639165 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639168 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639171 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639174 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639176 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639179 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639181 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639184 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639186 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639189 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639191 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639194 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.639196 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640665 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640679 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640687 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640692 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640696 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640699 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640704 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 16:21:40.642255 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640708 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640711 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640714 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640718 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640722 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640725 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640728 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640731 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640733 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640736 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640739 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640742 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640746 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640749 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640752 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640755 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640758 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640762 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640765 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640768 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640771 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640774 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640776 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640779 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640782 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 16:21:40.642749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640785 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640789 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640793 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640796 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640799 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640802 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640805 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640810 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640813 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640816 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640819 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640822 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640826 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640829 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640832 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640834 2575 flags.go:64] FLAG: --eviction-soft="" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640837 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640840 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640843 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640846 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640848 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640851 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640853 2575 flags.go:64] FLAG: --feature-gates="" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640857 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640860 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 16:21:40.643368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640863 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:21:40.640866 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640869 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640871 2575 flags.go:64] FLAG: --help="false" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640874 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-142-238.ec2.internal" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640877 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640880 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640882 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640886 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640890 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640892 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640895 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640899 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640902 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640905 2575 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640908 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640910 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640913 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640916 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640919 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640922 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640924 2575 flags.go:64] FLAG: --lock-file="" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640927 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640929 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 16:21:40.643952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640932 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640938 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640941 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640943 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640946 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 16:21:40.640949 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640952 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640955 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640957 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640962 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640965 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640968 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640971 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640974 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640976 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640979 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640982 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640985 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640988 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:21:40.640996 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.640999 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641002 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641004 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 16:21:40.644531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641008 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641013 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641016 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641019 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641021 2575 flags.go:64] FLAG: --port="10250" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641024 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641027 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-082b24ba11511bc6c" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641030 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641033 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641048 2575 flags.go:64] FLAG: --register-node="true" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641051 2575 
flags.go:64] FLAG: --register-schedulable="true" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641054 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641058 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641060 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641063 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641066 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641070 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641073 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641078 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641080 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641083 2575 flags.go:64] FLAG: --runonce="false" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641086 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641089 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641093 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641095 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:21:40.641098 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 16:21:40.645086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641101 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641104 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641107 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641110 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641112 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641115 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641118 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641121 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641124 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641127 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641132 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641134 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641137 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641140 2575 flags.go:64] FLAG: 
--tls-min-version="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641143 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641146 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641148 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641151 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641154 2575 flags.go:64] FLAG: --v="2" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641158 2575 flags.go:64] FLAG: --version="false" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641161 2575 flags.go:64] FLAG: --vmodule="" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641165 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.641168 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641259 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 16:21:40.645682 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641265 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641270 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641274 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641277 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641283 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641286 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641288 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641291 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641294 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641296 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641299 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641302 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641305 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641307 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641310 2575 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641313 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641316 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641318 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641321 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641323 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 16:21:40.646299 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641326 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641328 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641330 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641333 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641335 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641338 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641340 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 16:21:40.646828 
ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641342 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641345 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641347 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641349 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641352 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641355 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641358 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641360 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641362 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641366 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641368 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641371 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641373 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 
16:21:40.646828 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641376 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641378 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641381 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641383 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641386 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641388 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641390 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641393 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641395 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641401 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641403 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641406 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641408 2575 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641411 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641413 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641416 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641418 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641421 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641423 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641425 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 16:21:40.647331 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641427 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641430 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641432 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641435 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641439 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 16:21:40.647806 ip-10-0-142-238 
kubenswrapper[2575]: W0422 16:21:40.641441 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641444 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641446 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641450 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641452 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641455 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641457 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641459 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641462 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641464 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641466 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641469 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641471 2575 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641474 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641476 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 16:21:40.647806 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641478 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 16:21:40.648460 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641481 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 16:21:40.648460 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641484 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 16:21:40.648460 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641486 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 16:21:40.648460 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.641488 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 16:21:40.648460 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.642246 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 16:21:40.651093 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.651068 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 16:21:40.651093 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.651091 2575 
server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651139 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651144 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651148 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651151 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651154 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651157 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651160 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651162 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651165 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651168 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651170 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651173 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651176 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651178 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651181 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651183 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651185 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651188 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:40.651228 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651190 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651193 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651195 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651198 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651200 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651202 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651205 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651208 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651211 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651214 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651216 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651218 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651221 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651223 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651226 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651230 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651232 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651235 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651238 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651240 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:40.651697 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651242 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651245 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651247 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651249 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651252 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651254 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651257 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651259 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651261 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651264 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651266 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651269 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651271 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651274 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651277 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651279 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651282 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651284 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651287 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:40.652198 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651289 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651291 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651294 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651296 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651298 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651301 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651303 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651305 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651308 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651310 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651313 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651315 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651317 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651327 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651330 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651333 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651335 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651338 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651342 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651348 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:40.652647 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651351 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651355 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651359 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651362 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651365 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651367 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651370 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651372 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651374 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.651379 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651475 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651480 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651483 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651486 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651488 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:40.653178 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651491 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651494 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651496 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651499 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651501 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651505 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651509 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651512 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651514 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651517 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651519 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651521 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651524 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651526 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651529 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651531 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651533 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651537 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651540 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651542 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:40.653545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651544 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651547 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651549 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651551 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651554 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651556 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651559 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651561 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651563 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651566 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651568 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651571 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651573 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651576 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651578 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651580 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651583 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651585 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651588 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:40.654008 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651590 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651592 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651595 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651597 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651599 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651602 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651604 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651607 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651609 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651611 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651614 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651617 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651619 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651622 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651624 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651626 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651629 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651631 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651634 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651638 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:40.654479 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651641 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651643 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651647 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651649 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651652 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651654 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651657 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651659 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651661 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651663 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651666 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651668 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651671 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651673 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651675 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651677 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651680 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651682 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651684 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651687 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:40.654945 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651689 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:40.655440 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:40.651691 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:40.655440 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.651696 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 16:21:40.655440 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.652360 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 16:21:40.656459 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.656445 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 16:21:40.657453 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.657442 2575 server.go:1019] "Starting client certificate rotation"
Apr 22 16:21:40.657556 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.657540 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 16:21:40.657591 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.657582 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 16:21:40.683689 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.683670 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 16:21:40.685961 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.685942 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 16:21:40.697943 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.697925 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 22 16:21:40.704806 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.704790 2575 log.go:25] "Validated CRI v1 image API"
Apr 22 16:21:40.707928 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.707916 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 16:21:40.711931 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.711913 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8668a234-979f-4913-851f-19715eb70f46:/dev/nvme0n1p3 b9f0908b-4d8b-4668-a482-2fc037f852cf:/dev/nvme0n1p4]
Apr 22 16:21:40.711983 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.711932 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 16:21:40.714287 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.714271 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 16:21:40.717621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.717517 2575 manager.go:217] Machine: {Timestamp:2026-04-22 16:21:40.71548996 +0000 UTC m=+0.417508061 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3197839 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b5810552b062d0538496a94a1cca4 SystemUUID:ec2b5810-552b-062d-0538-496a94a1cca4 BootID:69b8c955-5ad3-44c0-a15d-95a6888c44f1 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c6:91:46:31:21 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c6:91:46:31:21 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:3e:6b:1f:53:75 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 16:21:40.717621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.717620 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 16:21:40.717721 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.717694 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 16:21:40.718907 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.718881 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 16:21:40.719049 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.718909 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-238.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 16:21:40.719095 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.719060 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 16:21:40.719095 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.719067 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 16:21:40.719095 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.719080 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 16:21:40.719927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.719917 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 16:21:40.721141 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.721132 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 16:21:40.721238 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.721230 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 16:21:40.723750 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.723741 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 16:21:40.723788 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.723754 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 16:21:40.723788 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.723765 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 16:21:40.723788 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.723773 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 22 16:21:40.723788 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.723781 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 16:21:40.724885 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.724873 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 16:21:40.724932 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.724892 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 16:21:40.727752 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.727735 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 16:21:40.729642 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:21:40.729625 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cmrh2" Apr 22 16:21:40.729890 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.729875 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 16:21:40.731985 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.731970 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.731992 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732002 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732009 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732018 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732025 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732035 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732061 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732082 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 16:21:40.732083 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732091 2575 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 16:21:40.732373 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732120 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 16:21:40.732373 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732134 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 16:21:40.733002 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.732992 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 16:21:40.733067 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.733004 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 16:21:40.736877 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.736857 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-238.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 16:21:40.736877 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.736869 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cmrh2" Apr 22 16:21:40.737065 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.737023 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-238.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 16:21:40.737100 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.737074 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 16:21:40.738334 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.738321 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 16:21:40.738386 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.738366 2575 server.go:1295] "Started kubelet" Apr 22 16:21:40.738499 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.738461 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 16:21:40.738559 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.738495 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 16:21:40.738607 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.738597 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 16:21:40.739425 ip-10-0-142-238 systemd[1]: Started Kubernetes Kubelet. Apr 22 16:21:40.739574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.739563 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 16:21:40.743728 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.743696 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 16:21:40.748068 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.748031 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 16:21:40.748882 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.747401 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-238.ec2.internal.18a8ba4ed4d0df14 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-238.ec2.internal,UID:ip-10-0-142-238.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-238.ec2.internal,},FirstTimestamp:2026-04-22 16:21:40.738334484 +0000 UTC m=+0.440352590,LastTimestamp:2026-04-22 16:21:40.738334484 +0000 UTC m=+0.440352590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-238.ec2.internal,}" Apr 22 16:21:40.749751 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.749734 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 16:21:40.750324 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.750310 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 16:21:40.751022 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751001 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 16:21:40.751127 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751026 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 16:21:40.751127 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.751092 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:40.751224 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751186 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 16:21:40.751224 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751194 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 16:21:40.751307 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751268 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 16:21:40.751307 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751282 2575 factory.go:55] Registering systemd factory Apr 22 16:21:40.751307 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751290 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 16:21:40.751919 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.751899 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 16:21:40.752173 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.752158 2575 factory.go:153] Registering CRI-O factory Apr 22 16:21:40.752267 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.752176 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 16:21:40.752267 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.752198 2575 factory.go:103] Registering Raw factory Apr 22 16:21:40.752267 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.752212 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 16:21:40.754609 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.752589 2575 manager.go:319] Starting recovery of all containers Apr 22 16:21:40.760243 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.760093 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:40.762695 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 16:21:40.762675 2575 manager.go:324] Recovery completed Apr 22 16:21:40.763239 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.763217 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-238.ec2.internal\" not found" node="ip-10-0-142-238.ec2.internal" Apr 22 16:21:40.764376 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.764357 2575 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 16:21:40.767057 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.767030 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:40.770605 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.770591 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:40.770652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.770618 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:40.770652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.770629 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:40.771159 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.771144 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 16:21:40.771159 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.771156 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 16:21:40.771278 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.771172 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 16:21:40.774182 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.774170 2575 
policy_none.go:49] "None policy: Start" Apr 22 16:21:40.774219 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.774187 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 16:21:40.774219 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.774197 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 16:21:40.815643 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.815627 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.815659 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.815668 2575 server.go:85] "Starting device plugin registration server" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.816220 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.816236 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.816359 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.816451 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.816459 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.818480 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 16:21:40.833432 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.818513 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:40.898651 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.898594 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 16:21:40.899848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.899825 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 16:21:40.899896 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.899862 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 16:21:40.899896 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.899890 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 16:21:40.899964 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.899900 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 16:21:40.899964 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.899938 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 16:21:40.902315 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.902291 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:40.917378 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.917360 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:40.918447 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.918431 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:40.918540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.918458 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:40.918540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.918468 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:40.918540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.918493 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-238.ec2.internal" Apr 22 16:21:40.926112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:40.926097 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-238.ec2.internal" Apr 22 16:21:40.926162 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.926120 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-238.ec2.internal\": node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:40.944336 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:40.944317 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.000053 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.000012 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal"] Apr 22 16:21:41.000139 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.000106 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:41.001059 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.001027 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:41.001117 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.001070 2575 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:41.001117 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.001083 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:41.002429 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.002417 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:41.002579 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.002565 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.002615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.002594 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:41.003406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.003385 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:41.003406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.003408 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:41.003523 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.003418 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:41.003523 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.003389 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:41.003523 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.003490 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 
16:21:41.003523 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.003501 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:41.004602 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.004589 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.004677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.004611 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:41.005235 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.005223 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:41.005290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.005245 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:41.005290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.005254 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:41.025694 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.025675 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-238.ec2.internal\" not found" node="ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.030004 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.029988 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-238.ec2.internal\" not found" node="ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.044938 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.044902 2575 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.145361 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.145336 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.153770 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.153683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cce540b655be06a7bb42ab4ffff03c36-config\") pod \"kube-apiserver-proxy-ip-10-0-142-238.ec2.internal\" (UID: \"cce540b655be06a7bb42ab4ffff03c36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.153770 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.153754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/292a4af51785efd6d8e1973dc2f1a57e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal\" (UID: \"292a4af51785efd6d8e1973dc2f1a57e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.153914 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.153823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/292a4af51785efd6d8e1973dc2f1a57e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal\" (UID: \"292a4af51785efd6d8e1973dc2f1a57e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.246054 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.246010 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.254443 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.254411 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cce540b655be06a7bb42ab4ffff03c36-config\") pod \"kube-apiserver-proxy-ip-10-0-142-238.ec2.internal\" (UID: \"cce540b655be06a7bb42ab4ffff03c36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.254518 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.254448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/292a4af51785efd6d8e1973dc2f1a57e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal\" (UID: \"292a4af51785efd6d8e1973dc2f1a57e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.254518 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.254467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/292a4af51785efd6d8e1973dc2f1a57e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal\" (UID: \"292a4af51785efd6d8e1973dc2f1a57e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.254518 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.254493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/292a4af51785efd6d8e1973dc2f1a57e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal\" (UID: \"292a4af51785efd6d8e1973dc2f1a57e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.254518 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.254511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/292a4af51785efd6d8e1973dc2f1a57e-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal\" (UID: \"292a4af51785efd6d8e1973dc2f1a57e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.254649 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.254513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cce540b655be06a7bb42ab4ffff03c36-config\") pod \"kube-apiserver-proxy-ip-10-0-142-238.ec2.internal\" (UID: \"cce540b655be06a7bb42ab4ffff03c36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.328604 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.328580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.333829 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.333814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:41.346094 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.346076 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.446717 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.446634 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.547292 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.547266 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.647935 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.647903 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.657297 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 16:21:41.657272 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 16:21:41.657503 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.657478 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 16:21:41.657596 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.657478 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 16:21:41.739507 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.739436 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 16:16:40 +0000 UTC" deadline="2027-12-07 18:06:55.677330096 +0000 UTC" Apr 22 16:21:41.739507 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.739459 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14257h45m13.937873475s" Apr 22 16:21:41.748890 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.748871 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.749928 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.749918 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 16:21:41.759402 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.759387 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 16:21:41.787937 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.787914 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vd2sx" Apr 22 16:21:41.796744 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.796726 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vd2sx" Apr 22 16:21:41.849104 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.849082 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:41.948799 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:41.948785 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:21:41.949150 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:41.949133 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:42.050086 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.049991 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:42.085549 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.085526 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:42.150683 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.150655 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:42.251424 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.251395 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-238.ec2.internal\" not found" Apr 22 16:21:42.311240 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.311181 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:42.351096 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.351071 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" Apr 22 16:21:42.362650 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.362612 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 16:21:42.363712 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.363693 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" Apr 22 16:21:42.371839 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.371753 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 16:21:42.502599 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.502450 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:42.725358 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.725276 2575 apiserver.go:52] "Watching apiserver" Apr 22 16:21:42.733912 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.733889 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 16:21:42.736319 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.736294 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-zkfjr","openshift-multus/multus-d5sxk","openshift-multus/network-metrics-daemon-5wqw7","openshift-ovn-kubernetes/ovnkube-node-pxgm2","kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7","openshift-dns/node-resolver-jg6wx","openshift-image-registry/node-ca-xkxjl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal","openshift-multus/multus-additional-cni-plugins-4gkbb","openshift-network-diagnostics/network-check-target-qf7zg","openshift-network-operator/iptables-alerter-cjtpq","kube-system/konnectivity-agent-4c4gx"] Apr 22 16:21:42.738005 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.737987 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.739467 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.739184 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.742504 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.742598 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.743008 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tx5sg\"" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.743204 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.743249 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.743647 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 16:21:42.744063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.743997 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 16:21:42.744418 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.744150 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:42.744418 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.744237 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:42.745789 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.745766 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8j2dt\"" Apr 22 16:21:42.747258 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.747240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.749000 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.748754 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.749189 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.749166 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.749436 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.749418 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 16:21:42.749649 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.749630 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.749806 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.749785 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7g8r8\"" Apr 22 16:21:42.750169 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.750152 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.750545 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.750525 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.751357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.751129 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.751357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.751223 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 16:21:42.751842 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.751828 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.752798 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.752779 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h4nm5\"" Apr 22 16:21:42.753012 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.752992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.753107 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753082 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bqrwn\"" Apr 22 16:21:42.753221 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753207 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 16:21:42.753288 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753273 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.753323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753306 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.753434 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.752785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.753518 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753501 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wzh6z\"" Apr 22 16:21:42.753674 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 16:21:42.753674 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753554 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.753759 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753561 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 16:21:42.753996 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.753963 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 16:21:42.754245 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.754229 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d5dct\"" Apr 22 16:21:42.754376 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.754359 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.754445 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.754407 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.754908 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.754891 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:42.755011 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.754957 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:42.757684 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.757668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4c4gx" Apr 22 16:21:42.757783 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.757673 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cjtpq" Apr 22 16:21:42.759747 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.759729 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 16:21:42.760003 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.759986 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 16:21:42.760924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.760381 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 16:21:42.760924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.760409 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 16:21:42.760924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.760588 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:21:42.760924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.760703 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4zlt\"" Apr 22 16:21:42.760924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.760754 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r28mx\"" Apr 22 16:21:42.763317 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-ovn\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.763419 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-hostroot\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763419 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-daemon-config\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763419 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-cni-bin\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763419 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-cni-multus\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdv4c\" (UniqueName: \"kubernetes.io/projected/4047cdbf-3d64-46a4-b565-550cd28e6eac-kube-api-access-wdv4c\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-sys\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-var-lib-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b508f921-8bf7-4ed5-858a-04f8cf475055-cni-binary-copy\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763538 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysconfig\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.763614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-kubelet\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46h59\" (UniqueName: \"kubernetes.io/projected/bbe5211a-cea9-4848-93bc-eaa0e38f906d-kube-api-access-46h59\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-system-cni-dir\") pod \"multus-d5sxk\" (UID: 
\"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-socket-dir-parent\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-registration-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysctl-conf\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-lib-modules\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/a929b972-02c0-4e8a-b302-09406b1c441c-serviceca\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovnkube-config\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/91576000-c254-43f5-84ba-7029c347da22-hosts-file\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 16:21:42.763915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-os-release\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.763939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-etc-kubernetes\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57nt\" (UniqueName: \"kubernetes.io/projected/b508f921-8bf7-4ed5-858a-04f8cf475055-kube-api-access-p57nt\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.763983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-node-log\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-cnibin\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.764582 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:21:42.764062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-socket-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-etc-selinux\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-cnibin\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-log-socket\") pod 
\"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpj6\" (UniqueName: \"kubernetes.io/projected/91576000-c254-43f5-84ba-7029c347da22-kube-api-access-lhpj6\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-sys-fs\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6hdk\" (UniqueName: \"kubernetes.io/projected/a929b972-02c0-4e8a-b302-09406b1c441c-kube-api-access-c6hdk\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764295 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-slash\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-cni-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.764582 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-device-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-systemd\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " 
pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-systemd-units\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-cni-bin\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764548 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovnkube-script-lib\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-netns\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-kubernetes\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-tuned\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-modprobe-d\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysctl-d\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764731 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a929b972-02c0-4e8a-b302-09406b1c441c-host\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-env-overrides\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxx6x\" (UniqueName: \"kubernetes.io/projected/09f37d35-30d1-4fc0-a88f-3514e6c16586-kube-api-access-gxx6x\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-etc-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.765372 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovn-node-metrics-cert\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-kubelet\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.764840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-host\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-run-netns\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-k8s-cni-cncf-io\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-var-lib-kubelet\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-os-release\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.766120 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:21:42.765682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-system-cni-dir\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91576000-c254-43f5-84ba-7029c347da22-tmp-dir\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-conf-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765777 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-multus-certs\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2124d5bf-41a3-4af2-ab62-5bec35d4e264-tmp\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " 
pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkhzg\" (UniqueName: \"kubernetes.io/projected/2124d5bf-41a3-4af2-ab62-5bec35d4e264-kube-api-access-pkhzg\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-cni-netd\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.766120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rks4\" (UniqueName: \"kubernetes.io/projected/020a35c8-d40b-477c-8c6e-1530096b3f1a-kube-api-access-4rks4\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.766799 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-systemd\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.766799 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.765979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-run\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.799398 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.799368 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 16:16:41 +0000 UTC" deadline="2027-12-17 03:29:59.906816108 +0000 UTC" Apr 22 16:21:42.799506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.799399 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14483h8m17.107420978s" Apr 22 16:21:42.853302 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.853277 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 16:21:42.866412 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-system-cni-dir\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91576000-c254-43f5-84ba-7029c347da22-tmp-dir\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-conf-dir\") pod 
\"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-multus-certs\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2124d5bf-41a3-4af2-ab62-5bec35d4e264-tmp\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-system-cni-dir\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-conf-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-multus-certs\") pod \"multus-d5sxk\" (UID: 
\"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkhzg\" (UniqueName: \"kubernetes.io/projected/2124d5bf-41a3-4af2-ab62-5bec35d4e264-kube-api-access-pkhzg\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-cni-netd\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rks4\" (UniqueName: \"kubernetes.io/projected/020a35c8-d40b-477c-8c6e-1530096b3f1a-kube-api-access-4rks4\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-systemd\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-run\") pod 
\"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-cni-netd\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-ovn\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-ovn\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-hostroot\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-systemd\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-daemon-config\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-hostroot\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91576000-c254-43f5-84ba-7029c347da22-tmp-dir\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-cni-bin\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866817 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-cni-multus\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdv4c\" (UniqueName: \"kubernetes.io/projected/4047cdbf-3d64-46a4-b565-550cd28e6eac-kube-api-access-wdv4c\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-cni-bin\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.866930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-sys\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-cni-multus\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-var-lib-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b508f921-8bf7-4ed5-858a-04f8cf475055-cni-binary-copy\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-sys\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysconfig\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-var-lib-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysconfig\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.866819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-run\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-kubelet\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46h59\" 
(UniqueName: \"kubernetes.io/projected/bbe5211a-cea9-4848-93bc-eaa0e38f906d-kube-api-access-46h59\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-system-cni-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-socket-dir-parent\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-registration-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-kubelet\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867260 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-daemon-config\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysctl-conf\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.867782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-lib-modules\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-system-cni-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a929b972-02c0-4e8a-b302-09406b1c441c-serviceca\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-socket-dir-parent\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-lib-modules\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b508f921-8bf7-4ed5-858a-04f8cf475055-cni-binary-copy\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-registration-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867585 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysctl-conf\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovnkube-config\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867610 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/91576000-c254-43f5-84ba-7029c347da22-hosts-file\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867651 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-run-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-os-release\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-etc-kubernetes\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p57nt\" (UniqueName: \"kubernetes.io/projected/b508f921-8bf7-4ed5-858a-04f8cf475055-kube-api-access-p57nt\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a929b972-02c0-4e8a-b302-09406b1c441c-serviceca\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.868511 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/91576000-c254-43f5-84ba-7029c347da22-hosts-file\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-etc-kubernetes\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-os-release\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-node-log\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-cnibin\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod 
\"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867975 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-socket-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-node-log\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-cnibin\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.867995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-etc-selinux\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-iptables-alerter-script\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-socket-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfzr6\" (UniqueName: \"kubernetes.io/projected/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-kube-api-access-sfzr6\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-etc-selinux\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869237 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-cnibin\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.868546 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-log-socket\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-cnibin\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.868711 2575 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:43.368680115 +0000 UTC m=+3.070698222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpj6\" (UniqueName: \"kubernetes.io/projected/91576000-c254-43f5-84ba-7029c347da22-kube-api-access-lhpj6\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-sys-fs\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6hdk\" (UniqueName: \"kubernetes.io/projected/a929b972-02c0-4e8a-b302-09406b1c441c-kube-api-access-c6hdk\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-slash\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-cni-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovnkube-config\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-sys-fs\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.868896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-device-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-systemd\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-systemd-units\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.869925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869084 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-slash\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4047cdbf-3d64-46a4-b565-550cd28e6eac-device-dir\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-cni-bin\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-systemd\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:21:42.869147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-multus-cni-dir\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovnkube-script-lib\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-log-socket\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-netns\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869228 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-netns\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869230 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-kubernetes\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-kubernetes\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-tuned\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-modprobe-d\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-cni-bin\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869407 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-modprobe-d\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysctl-d\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-systemd-units\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.870615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a929b972-02c0-4e8a-b302-09406b1c441c-host\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869511 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-sysctl-d\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a929b972-02c0-4e8a-b302-09406b1c441c-host\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-env-overrides\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxx6x\" (UniqueName: \"kubernetes.io/projected/09f37d35-30d1-4fc0-a88f-3514e6c16586-kube-api-access-gxx6x\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869619 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-etc-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovn-node-metrics-cert\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-kubelet\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-host\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovnkube-script-lib\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/df5633bf-88a8-430f-b582-1e8a7a03005c-agent-certs\") pod \"konnectivity-agent-4c4gx\" (UID: \"df5633bf-88a8-430f-b582-1e8a7a03005c\") " pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-run-netns\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-k8s-cni-cncf-io\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-var-lib-kubelet\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.871432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/df5633bf-88a8-430f-b582-1e8a7a03005c-konnectivity-ca\") pod \"konnectivity-agent-4c4gx\" (UID: \"df5633bf-88a8-430f-b582-1e8a7a03005c\") " pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-host-slash\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-os-release\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe5211a-cea9-4848-93bc-eaa0e38f906d-env-overrides\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-var-lib-kubelet\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.869962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-host\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b508f921-8bf7-4ed5-858a-04f8cf475055-host-run-k8s-cni-cncf-io\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-etc-openvswitch\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbe5211a-cea9-4848-93bc-eaa0e38f906d-host-run-netns\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-os-release\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2124d5bf-41a3-4af2-ab62-5bec35d4e264-tmp\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2124d5bf-41a3-4af2-ab62-5bec35d4e264-var-lib-kubelet\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870229 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/020a35c8-d40b-477c-8c6e-1530096b3f1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.872156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.870381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/020a35c8-d40b-477c-8c6e-1530096b3f1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.872978 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.872956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2124d5bf-41a3-4af2-ab62-5bec35d4e264-etc-tuned\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.873360 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.873338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbe5211a-cea9-4848-93bc-eaa0e38f906d-ovn-node-metrics-cert\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.877731 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.877706 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:42.877834 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.877736 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:42.877834 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.877749 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:42.878551 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:42.878381 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:43.377805957 +0000 UTC m=+3.079824049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:42.879463 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.879397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6hdk\" (UniqueName: \"kubernetes.io/projected/a929b972-02c0-4e8a-b302-09406b1c441c-kube-api-access-c6hdk\") pod \"node-ca-xkxjl\" (UID: \"a929b972-02c0-4e8a-b302-09406b1c441c\") " pod="openshift-image-registry/node-ca-xkxjl"
Apr 22 16:21:42.879553 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.879473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57nt\" (UniqueName: \"kubernetes.io/projected/b508f921-8bf7-4ed5-858a-04f8cf475055-kube-api-access-p57nt\") pod \"multus-d5sxk\" (UID: \"b508f921-8bf7-4ed5-858a-04f8cf475055\") " pod="openshift-multus/multus-d5sxk"
Apr 22 16:21:42.880132 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.880110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdv4c\" (UniqueName: \"kubernetes.io/projected/4047cdbf-3d64-46a4-b565-550cd28e6eac-kube-api-access-wdv4c\") pod \"aws-ebs-csi-driver-node-ng4b7\" (UID: \"4047cdbf-3d64-46a4-b565-550cd28e6eac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7"
Apr 22 16:21:42.881480 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.881461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpj6\" (UniqueName: \"kubernetes.io/projected/91576000-c254-43f5-84ba-7029c347da22-kube-api-access-lhpj6\") pod \"node-resolver-jg6wx\" (UID: \"91576000-c254-43f5-84ba-7029c347da22\") " pod="openshift-dns/node-resolver-jg6wx"
Apr 22 16:21:42.881584 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.881555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhzg\" (UniqueName: \"kubernetes.io/projected/2124d5bf-41a3-4af2-ab62-5bec35d4e264-kube-api-access-pkhzg\") pod \"tuned-zkfjr\" (UID: \"2124d5bf-41a3-4af2-ab62-5bec35d4e264\") " pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:42.882321 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.882303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxx6x\" (UniqueName: \"kubernetes.io/projected/09f37d35-30d1-4fc0-a88f-3514e6c16586-kube-api-access-gxx6x\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7"
Apr 22 16:21:42.883409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.883385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46h59\" (UniqueName: \"kubernetes.io/projected/bbe5211a-cea9-4848-93bc-eaa0e38f906d-kube-api-access-46h59\") pod \"ovnkube-node-pxgm2\" (UID: \"bbe5211a-cea9-4848-93bc-eaa0e38f906d\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:42.886084 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.886065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rks4\" (UniqueName: \"kubernetes.io/projected/020a35c8-d40b-477c-8c6e-1530096b3f1a-kube-api-access-4rks4\") pod \"multus-additional-cni-plugins-4gkbb\" (UID: \"020a35c8-d40b-477c-8c6e-1530096b3f1a\") " pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:42.904483 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.904437 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" event={"ID":"292a4af51785efd6d8e1973dc2f1a57e","Type":"ContainerStarted","Data":"60fc3d821bfd48502c4636d7d930588ea19d5bf8afb9292ef0824f85c1ae5c44"}
Apr 22 16:21:42.905515 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.905481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" event={"ID":"cce540b655be06a7bb42ab4ffff03c36","Type":"ContainerStarted","Data":"ab917b73309ce44bd0c692d0730e623887ec328fdd6701e11d871a7f8d60ff7a"}
Apr 22 16:21:42.970276 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/df5633bf-88a8-430f-b582-1e8a7a03005c-agent-certs\") pod \"konnectivity-agent-4c4gx\" (UID: \"df5633bf-88a8-430f-b582-1e8a7a03005c\") " pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:42.970276 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/df5633bf-88a8-430f-b582-1e8a7a03005c-konnectivity-ca\") pod \"konnectivity-agent-4c4gx\" (UID: \"df5633bf-88a8-430f-b582-1e8a7a03005c\") " pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:42.970496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-host-slash\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:42.970496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-iptables-alerter-script\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:42.970496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfzr6\" (UniqueName: \"kubernetes.io/projected/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-kube-api-access-sfzr6\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:42.970496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-host-slash\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:42.970956 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/df5633bf-88a8-430f-b582-1e8a7a03005c-konnectivity-ca\") pod \"konnectivity-agent-4c4gx\" (UID: \"df5633bf-88a8-430f-b582-1e8a7a03005c\") " pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:42.971057 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.970931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-iptables-alerter-script\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:42.973132 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.973111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/df5633bf-88a8-430f-b582-1e8a7a03005c-agent-certs\") pod \"konnectivity-agent-4c4gx\" (UID: \"df5633bf-88a8-430f-b582-1e8a7a03005c\") " pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:42.977837 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:42.977784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfzr6\" (UniqueName: \"kubernetes.io/projected/3f7797ac-2216-4ec7-b9fa-f20eb6f39230-kube-api-access-sfzr6\") pod \"iptables-alerter-cjtpq\" (UID: \"3f7797ac-2216-4ec7-b9fa-f20eb6f39230\") " pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:43.051852 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.051817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4gkbb"
Apr 22 16:21:43.059787 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.059762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d5sxk"
Apr 22 16:21:43.071471 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.071447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7"
Apr 22 16:21:43.076173 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.076143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2"
Apr 22 16:21:43.083763 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.083742 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jg6wx"
Apr 22 16:21:43.091416 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.091397 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zkfjr"
Apr 22 16:21:43.099049 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.099016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xkxjl"
Apr 22 16:21:43.105623 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.105603 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4c4gx"
Apr 22 16:21:43.110190 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.110170 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cjtpq"
Apr 22 16:21:43.247848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.247767 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:43.373453 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.373417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7"
Apr 22 16:21:43.373617 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.373567 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:43.373669 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.373637 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:44.373617029 +0000 UTC m=+4.075635118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:43.474637 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.474602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg"
Apr 22 16:21:43.474788 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.474742 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:43.474788 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.474757 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:43.474788 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.474778 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:43.474923 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.474832 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:44.474815975 +0000 UTC m=+4.176834062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:43.571343 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.571292 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbe5211a_cea9_4848_93bc_eaa0e38f906d.slice/crio-f92fccec0aac2a577043d1e6b42320886aee4d67dc92152ef5b329c78ae11495 WatchSource:0}: Error finding container f92fccec0aac2a577043d1e6b42320886aee4d67dc92152ef5b329c78ae11495: Status 404 returned error can't find the container with id f92fccec0aac2a577043d1e6b42320886aee4d67dc92152ef5b329c78ae11495
Apr 22 16:21:43.573423 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.573360 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7797ac_2216_4ec7_b9fa_f20eb6f39230.slice/crio-19f61c2a740abe5c774432884ac4c76953881601b7214d14c1148d50391d97c3 WatchSource:0}: Error finding container 19f61c2a740abe5c774432884ac4c76953881601b7214d14c1148d50391d97c3: Status 404 returned error can't find the container with id 19f61c2a740abe5c774432884ac4c76953881601b7214d14c1148d50391d97c3
Apr 22 16:21:43.575180 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.574714 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2124d5bf_41a3_4af2_ab62_5bec35d4e264.slice/crio-08503b6bba1ae4935f6437884200ca4012c4ccaed96e44f99c94cdad0cc6f0be WatchSource:0}: Error finding container 08503b6bba1ae4935f6437884200ca4012c4ccaed96e44f99c94cdad0cc6f0be: Status 404 returned error can't find the container with id 08503b6bba1ae4935f6437884200ca4012c4ccaed96e44f99c94cdad0cc6f0be
Apr 22 16:21:43.577368 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.577345 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91576000_c254_43f5_84ba_7029c347da22.slice/crio-b417d2401af0484282da1ac953b89f2c98a83dc709e54004921888edd1bf3700 WatchSource:0}: Error finding container b417d2401af0484282da1ac953b89f2c98a83dc709e54004921888edd1bf3700: Status 404 returned error can't find the container with id b417d2401af0484282da1ac953b89f2c98a83dc709e54004921888edd1bf3700
Apr 22 16:21:43.577766 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.577663 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4047cdbf_3d64_46a4_b565_550cd28e6eac.slice/crio-ac38dca52073694d56fa0bbf409c5507c84d1d02b1946c6f0d71a906fbe1b784 WatchSource:0}: Error finding container ac38dca52073694d56fa0bbf409c5507c84d1d02b1946c6f0d71a906fbe1b784: Status 404 returned error can't find the container with id ac38dca52073694d56fa0bbf409c5507c84d1d02b1946c6f0d71a906fbe1b784
Apr 22 16:21:43.580268 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.578949 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf5633bf_88a8_430f_b582_1e8a7a03005c.slice/crio-29fd3258e0cc0a07c9aaca669355e5acec266677f4b6adfb3c0a4d99f3259489 WatchSource:0}: Error finding container 29fd3258e0cc0a07c9aaca669355e5acec266677f4b6adfb3c0a4d99f3259489: Status 404 returned error can't find the container with id 29fd3258e0cc0a07c9aaca669355e5acec266677f4b6adfb3c0a4d99f3259489
Apr 22 16:21:43.580268 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.579471 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb508f921_8bf7_4ed5_858a_04f8cf475055.slice/crio-f6478187f21b22b268247d4524544913387d11c34a05ec67a7eae5000194e5d4 WatchSource:0}: Error finding container f6478187f21b22b268247d4524544913387d11c34a05ec67a7eae5000194e5d4: Status 404 returned error can't find the container with id f6478187f21b22b268247d4524544913387d11c34a05ec67a7eae5000194e5d4
Apr 22 16:21:43.580910 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.580590 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda929b972_02c0_4e8a_b302_09406b1c441c.slice/crio-d0ec41a75f3f5dbc838d1bea16572fd40fdd9e08316c758494837d9c248d9137 WatchSource:0}: Error finding container d0ec41a75f3f5dbc838d1bea16572fd40fdd9e08316c758494837d9c248d9137: Status 404 returned error can't find the container with id d0ec41a75f3f5dbc838d1bea16572fd40fdd9e08316c758494837d9c248d9137
Apr 22 16:21:43.583142 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:21:43.583004 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod020a35c8_d40b_477c_8c6e_1530096b3f1a.slice/crio-cfc525161691503d22ab8fcf4bef468d271943628608fd48ccf607ca918eb958 WatchSource:0}: Error finding container cfc525161691503d22ab8fcf4bef468d271943628608fd48ccf607ca918eb958: Status 404 returned error can't find the container with id cfc525161691503d22ab8fcf4bef468d271943628608fd48ccf607ca918eb958
Apr 22 16:21:43.799938 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.799774 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 16:16:41 +0000 UTC" deadline="2027-10-15 12:38:24.329369853 +0000 UTC"
Apr 22 16:21:43.799938 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.799933 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12980h16m40.529439663s"
Apr 22 16:21:43.900205 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.900131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7"
Apr 22 16:21:43.900332 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:43.900244 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586"
Apr 22 16:21:43.908578 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.908545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" event={"ID":"2124d5bf-41a3-4af2-ab62-5bec35d4e264","Type":"ContainerStarted","Data":"08503b6bba1ae4935f6437884200ca4012c4ccaed96e44f99c94cdad0cc6f0be"}
Apr 22 16:21:43.909683 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.909652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"f92fccec0aac2a577043d1e6b42320886aee4d67dc92152ef5b329c78ae11495"}
Apr 22 16:21:43.910622 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.910602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerStarted","Data":"cfc525161691503d22ab8fcf4bef468d271943628608fd48ccf607ca918eb958"}
Apr 22 16:21:43.911589 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.911565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jg6wx" event={"ID":"91576000-c254-43f5-84ba-7029c347da22","Type":"ContainerStarted","Data":"b417d2401af0484282da1ac953b89f2c98a83dc709e54004921888edd1bf3700"}
Apr 22 16:21:43.912569 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.912544 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cjtpq" event={"ID":"3f7797ac-2216-4ec7-b9fa-f20eb6f39230","Type":"ContainerStarted","Data":"19f61c2a740abe5c774432884ac4c76953881601b7214d14c1148d50391d97c3"}
Apr 22 16:21:43.914457 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.914427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" event={"ID":"cce540b655be06a7bb42ab4ffff03c36","Type":"ContainerStarted","Data":"835995c1c3327c24b6a160609041247a4d196437a9ce8c13e7037d9ee757ea34"}
Apr 22 16:21:43.915540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.915516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xkxjl" event={"ID":"a929b972-02c0-4e8a-b302-09406b1c441c","Type":"ContainerStarted","Data":"d0ec41a75f3f5dbc838d1bea16572fd40fdd9e08316c758494837d9c248d9137"}
Apr 22 16:21:43.916644 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.916608 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5sxk" event={"ID":"b508f921-8bf7-4ed5-858a-04f8cf475055","Type":"ContainerStarted","Data":"f6478187f21b22b268247d4524544913387d11c34a05ec67a7eae5000194e5d4"}
Apr 22 16:21:43.917873 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.917851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4c4gx" event={"ID":"df5633bf-88a8-430f-b582-1e8a7a03005c","Type":"ContainerStarted","Data":"29fd3258e0cc0a07c9aaca669355e5acec266677f4b6adfb3c0a4d99f3259489"}
Apr 22 16:21:43.918981 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.918959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" event={"ID":"4047cdbf-3d64-46a4-b565-550cd28e6eac","Type":"ContainerStarted","Data":"ac38dca52073694d56fa0bbf409c5507c84d1d02b1946c6f0d71a906fbe1b784"}
Apr 22 16:21:43.927471 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:43.927427 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-238.ec2.internal" podStartSLOduration=1.9274119509999998 podStartE2EDuration="1.927411951s" podCreationTimestamp="2026-04-22 16:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:21:43.927212818 +0000 UTC m=+3.629230940" watchObservedRunningTime="2026-04-22 16:21:43.927411951 +0000 UTC m=+3.629430061"
Apr 22 16:21:44.380418 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:44.379760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7"
Apr 22 16:21:44.380418 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.379925 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:44.380418 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.379987 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:46.379968279 +0000 UTC m=+6.081986373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:44.482364 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:44.481710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg"
Apr 22 16:21:44.482364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.481903 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:44.482364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.481925 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:44.482364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.481938 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:44.482364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.481994 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:46.481975481 +0000 UTC m=+6.183993573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:44.903109 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:44.903077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg"
Apr 22 16:21:44.903535 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:44.903200 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db"
Apr 22 16:21:44.934714 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:44.934523 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" event={"ID":"292a4af51785efd6d8e1973dc2f1a57e","Type":"ContainerStarted","Data":"8a240423e7eaad432726ffe5eea17867ad3ed5aaa9189bcf63e40e58a007fb4c"}
Apr 22 16:21:45.900735 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:45.900703 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:45.900927 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:45.900837 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:45.937350 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:45.937312 2575 generic.go:358] "Generic (PLEG): container finished" podID="292a4af51785efd6d8e1973dc2f1a57e" containerID="8a240423e7eaad432726ffe5eea17867ad3ed5aaa9189bcf63e40e58a007fb4c" exitCode=0 Apr 22 16:21:45.937756 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:45.937372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" event={"ID":"292a4af51785efd6d8e1973dc2f1a57e","Type":"ContainerDied","Data":"8a240423e7eaad432726ffe5eea17867ad3ed5aaa9189bcf63e40e58a007fb4c"} Apr 22 16:21:46.398801 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:46.398712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:46.399001 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.398885 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:46.399001 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.398947 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:50.39892866 +0000 UTC m=+10.100946753 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:46.499312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:46.499275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:46.499542 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.499447 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:46.499542 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.499466 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:46.499542 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.499477 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:46.499542 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.499533 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:50.499515096 +0000 UTC m=+10.201533183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:46.901002 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:46.900972 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:46.901243 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:46.901136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:47.900777 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:47.900692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:47.901318 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:47.900836 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:48.900743 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:48.900715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:48.900895 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:48.900816 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:49.901413 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:49.901381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:49.901806 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:49.901554 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:50.433174 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:50.433143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:50.433362 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.433288 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:50.433362 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.433340 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:58.433326443 +0000 UTC m=+18.135344530 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:50.534336 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:50.534283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:50.534527 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.534466 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:50.534664 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.534545 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:50.534664 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.534560 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:50.534664 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.534641 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. 
No retries permitted until 2026-04-22 16:21:58.534621615 +0000 UTC m=+18.236639718 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:50.901841 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:50.901810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:50.902211 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:50.901918 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:51.900965 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:51.900923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:51.901178 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:51.901088 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:52.900821 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:52.900737 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:52.901226 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:52.900864 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:53.901027 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:53.900991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:53.901457 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:53.901143 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:54.900989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:54.900957 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:54.901185 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:54.901096 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:55.900887 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:55.900835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:55.901101 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:55.900983 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:56.900856 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:56.900823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:56.901308 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:56.900944 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:57.900433 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:57.900404 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:57.900598 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:57.900572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:21:58.491633 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:58.491598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:58.492104 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.491745 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:58.492104 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.491808 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:14.491792664 +0000 UTC m=+34.193810751 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:58.592457 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:58.592423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:58.592622 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.592557 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:58.592622 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.592577 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:58.592622 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.592588 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:58.592771 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.592649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. 
No retries permitted until 2026-04-22 16:22:14.592631264 +0000 UTC m=+34.294649371 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:58.900836 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:58.900772 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:21:58.901073 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:58.900874 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:21:59.900663 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:21:59.900632 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:21:59.900989 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:21:59.900752 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:00.900994 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.900829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:00.901616 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:00.901089 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:00.964118 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.964085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" event={"ID":"292a4af51785efd6d8e1973dc2f1a57e","Type":"ContainerStarted","Data":"f33bcf84a4604edb5baf1dc9fb07debc8f17a8d7ac404c2766037e7828e1f404"} Apr 22 16:22:00.966015 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.965983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xkxjl" event={"ID":"a929b972-02c0-4e8a-b302-09406b1c441c","Type":"ContainerStarted","Data":"cd95a9ddb3319168afe07f1d298eabeb7710684112a404cdec0db3e952b1da86"} Apr 22 16:22:00.968748 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.968715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d5sxk" event={"ID":"b508f921-8bf7-4ed5-858a-04f8cf475055","Type":"ContainerStarted","Data":"193731ded2e3d8ad6e5afd6e5721e600611a70b00994f17d6741665619159e24"} Apr 22 16:22:00.969912 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.969889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-4c4gx" event={"ID":"df5633bf-88a8-430f-b582-1e8a7a03005c","Type":"ContainerStarted","Data":"39b4c839d59c30d4b58163bb8ce0cbacc8862ec5659bfceb8fa0ca430dfd5b74"} Apr 22 16:22:00.971144 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.971116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" event={"ID":"4047cdbf-3d64-46a4-b565-550cd28e6eac","Type":"ContainerStarted","Data":"f6f1912f3ee7b51ac59855e25a79c050e2807457ed74c8fda52ee884c41a55d9"} Apr 22 16:22:00.972292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.972267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" event={"ID":"2124d5bf-41a3-4af2-ab62-5bec35d4e264","Type":"ContainerStarted","Data":"d79d695233348b5836f66da2e6b522c42dfc4ebd84d8de43aa411db2a97c105f"} Apr 22 16:22:00.973666 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.973649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:22:00.973927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.973910 2575 generic.go:358] "Generic (PLEG): container finished" podID="bbe5211a-cea9-4848-93bc-eaa0e38f906d" containerID="d8fc73b92a94f1237e766a2df886f32415e8db57e4962923c726a11e6c693f23" exitCode=1 Apr 22 16:22:00.974003 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.973956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"c487d7452b24e49cc7b5a4ce14377e1cb80099ba87f1ce0f312081201828999c"} Apr 22 16:22:00.974003 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.973969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" 
event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerDied","Data":"d8fc73b92a94f1237e766a2df886f32415e8db57e4962923c726a11e6c693f23"} Apr 22 16:22:00.974003 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.973979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"35f7d5b7682a6abc61797ea88f064fa4ddc8d351ab3d9d58a7605a33ce5ad303"} Apr 22 16:22:00.975155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.975136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerStarted","Data":"c30914fc8458dcf183c03bd9a3f0e483fddcbc2343c3e6ad0235f92da0410056"} Apr 22 16:22:00.976263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.976248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jg6wx" event={"ID":"91576000-c254-43f5-84ba-7029c347da22","Type":"ContainerStarted","Data":"1ccce1075c9bf123478a782b86659b97a3f55832878b8aa99935d93aee3123e0"} Apr 22 16:22:00.978808 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.978774 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-238.ec2.internal" podStartSLOduration=18.978764762 podStartE2EDuration="18.978764762s" podCreationTimestamp="2026-04-22 16:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:22:00.978397808 +0000 UTC m=+20.680415916" watchObservedRunningTime="2026-04-22 16:22:00.978764762 +0000 UTC m=+20.680782871" Apr 22 16:22:00.990444 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:00.990406 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jg6wx" 
podStartSLOduration=3.1783283 podStartE2EDuration="19.99039405s" podCreationTimestamp="2026-04-22 16:21:41 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.58018087 +0000 UTC m=+3.282198964" lastFinishedPulling="2026-04-22 16:22:00.392246626 +0000 UTC m=+20.094264714" observedRunningTime="2026-04-22 16:22:00.990248291 +0000 UTC m=+20.692266399" watchObservedRunningTime="2026-04-22 16:22:00.99039405 +0000 UTC m=+20.692412159" Apr 22 16:22:01.002794 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.002748 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xkxjl" podStartSLOduration=3.166892047 podStartE2EDuration="20.002734386s" podCreationTimestamp="2026-04-22 16:21:41 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.584184721 +0000 UTC m=+3.286202809" lastFinishedPulling="2026-04-22 16:22:00.420027049 +0000 UTC m=+20.122045148" observedRunningTime="2026-04-22 16:22:01.002375617 +0000 UTC m=+20.704393726" watchObservedRunningTime="2026-04-22 16:22:01.002734386 +0000 UTC m=+20.704752495" Apr 22 16:22:01.041742 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.041693 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zkfjr" podStartSLOduration=4.226156116 podStartE2EDuration="21.041679883s" podCreationTimestamp="2026-04-22 16:21:40 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.576777717 +0000 UTC m=+3.278795816" lastFinishedPulling="2026-04-22 16:22:00.392301481 +0000 UTC m=+20.094319583" observedRunningTime="2026-04-22 16:22:01.041291506 +0000 UTC m=+20.743309614" watchObservedRunningTime="2026-04-22 16:22:01.041679883 +0000 UTC m=+20.743698030" Apr 22 16:22:01.057102 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.057061 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4c4gx" podStartSLOduration=11.15138291 podStartE2EDuration="20.057028104s" 
podCreationTimestamp="2026-04-22 16:21:41 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.581660834 +0000 UTC m=+3.283678934" lastFinishedPulling="2026-04-22 16:21:52.487306028 +0000 UTC m=+12.189324128" observedRunningTime="2026-04-22 16:22:01.056722687 +0000 UTC m=+20.758740799" watchObservedRunningTime="2026-04-22 16:22:01.057028104 +0000 UTC m=+20.759046210" Apr 22 16:22:01.076683 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.076609 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d5sxk" podStartSLOduration=4.227426114 podStartE2EDuration="21.076590753s" podCreationTimestamp="2026-04-22 16:21:40 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.58137466 +0000 UTC m=+3.283392765" lastFinishedPulling="2026-04-22 16:22:00.430539317 +0000 UTC m=+20.132557404" observedRunningTime="2026-04-22 16:22:01.075750842 +0000 UTC m=+20.777768953" watchObservedRunningTime="2026-04-22 16:22:01.076590753 +0000 UTC m=+20.778608864" Apr 22 16:22:01.900748 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.900582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:01.900894 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:01.900819 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:01.982489 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.982464 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:22:01.982989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.982845 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"1677c5c0a1886552661b6d26cfeffc29204701f12e4409025a90284f0235be5d"} Apr 22 16:22:01.982989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.982878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"5012f4c2a652e031c608ad76dd0daf48a0e388113ac7fb213d33a1f5d5f3be3b"} Apr 22 16:22:01.982989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.982893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"e4ff9ec432c824c5226888d1521a0b1193b48aa06b8901e5fa7179a3aa4c68ad"} Apr 22 16:22:01.984470 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.984447 2575 generic.go:358] "Generic (PLEG): container finished" podID="020a35c8-d40b-477c-8c6e-1530096b3f1a" containerID="c30914fc8458dcf183c03bd9a3f0e483fddcbc2343c3e6ad0235f92da0410056" exitCode=0 Apr 22 16:22:01.984573 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:01.984487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerDied","Data":"c30914fc8458dcf183c03bd9a3f0e483fddcbc2343c3e6ad0235f92da0410056"} Apr 22 
16:22:02.078281 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.078256 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 16:22:02.828202 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.828100 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T16:22:02.078277967Z","UUID":"58a44095-a7ae-4245-ba7f-a4f5a674631b","Handler":null,"Name":"","Endpoint":""} Apr 22 16:22:02.831424 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.831390 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 16:22:02.831424 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.831421 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 16:22:02.900469 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.900437 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:02.900647 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:02.900544 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:02.988508 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.988469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" event={"ID":"4047cdbf-3d64-46a4-b565-550cd28e6eac","Type":"ContainerStarted","Data":"eff446a4b6f66fa52f8a0343c5fb9590b0e349d1d5681eb145da2a35312e52dd"} Apr 22 16:22:02.990185 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:02.990157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cjtpq" event={"ID":"3f7797ac-2216-4ec7-b9fa-f20eb6f39230","Type":"ContainerStarted","Data":"0e79dc23130c42fae39b8072455d7690e2cabef9df445275e47d98d65af21af8"} Apr 22 16:22:03.234163 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.233916 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4c4gx" Apr 22 16:22:03.234503 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.234483 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4c4gx" Apr 22 16:22:03.248713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.248673 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cjtpq" podStartSLOduration=5.404006102 podStartE2EDuration="22.248657153s" podCreationTimestamp="2026-04-22 16:21:41 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.575140851 +0000 UTC m=+3.277158938" lastFinishedPulling="2026-04-22 16:22:00.419791893 +0000 UTC m=+20.121809989" observedRunningTime="2026-04-22 16:22:03.010872987 +0000 UTC m=+22.712891095" watchObservedRunningTime="2026-04-22 16:22:03.248657153 +0000 UTC m=+22.950675329" Apr 22 16:22:03.900691 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.900652 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:03.900859 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:03.900777 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:03.993946 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.993914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" event={"ID":"4047cdbf-3d64-46a4-b565-550cd28e6eac","Type":"ContainerStarted","Data":"0b4435942fbdc445e6ef0bd9f926e04a9ebd6059d62cfa10aaf357d227f3594b"} Apr 22 16:22:03.996714 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.996689 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:22:03.996983 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:03.996958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"227ab52ffdbb6f1a542cfe6a8f11cb5fba30ddad472152b5cb16f60fa30cc59e"} Apr 22 16:22:04.900181 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:04.900151 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:04.900311 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:04.900274 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:05.900940 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:05.900863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:05.901614 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:05.900997 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:06.903220 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:06.903198 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:06.903730 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:06.903307 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:07.006149 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.006097 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:22:07.006530 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.006405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"e5826ae88346f84e3003cf095c15534fa1d0fbc0cde8acdc748195204115a340"} Apr 22 16:22:07.007015 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.006762 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:22:07.007297 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.007019 2575 scope.go:117] "RemoveContainer" containerID="d8fc73b92a94f1237e766a2df886f32415e8db57e4962923c726a11e6c693f23" Apr 22 16:22:07.021800 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.021782 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:22:07.036869 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.036827 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ng4b7" podStartSLOduration=7.578926754 podStartE2EDuration="27.036813322s" podCreationTimestamp="2026-04-22 16:21:40 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.580936103 +0000 UTC m=+3.282954193" lastFinishedPulling="2026-04-22 16:22:03.038822673 +0000 UTC m=+22.740840761" observedRunningTime="2026-04-22 16:22:04.011371393 +0000 UTC m=+23.713389503" watchObservedRunningTime="2026-04-22 16:22:07.036813322 +0000 UTC m=+26.738831433" Apr 22 16:22:07.900707 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:07.900529 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:07.900839 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:07.900782 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:08.010923 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.010896 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:22:08.011292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.011221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" event={"ID":"bbe5211a-cea9-4848-93bc-eaa0e38f906d","Type":"ContainerStarted","Data":"0aded3662a60b6a0cc3d59df5d69ebeae8055dd43f7dfd0d3db26a1c23463b88"} Apr 22 16:22:08.011462 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.011438 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:22:08.011608 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.011469 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:22:08.012967 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.012946 2575 generic.go:358] "Generic (PLEG): container finished" podID="020a35c8-d40b-477c-8c6e-1530096b3f1a" containerID="9fe22a6dbfeb67c6c2e94191ec096466bfab1c23806056f2ef1ad77354bb6b9b" exitCode=0 Apr 22 16:22:08.013064 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:22:08.012975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerDied","Data":"9fe22a6dbfeb67c6c2e94191ec096466bfab1c23806056f2ef1ad77354bb6b9b"} Apr 22 16:22:08.026591 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.026571 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:22:08.036221 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.036187 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" podStartSLOduration=11.120352654 podStartE2EDuration="28.036176104s" podCreationTimestamp="2026-04-22 16:21:40 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.573088584 +0000 UTC m=+3.275106676" lastFinishedPulling="2026-04-22 16:22:00.488912035 +0000 UTC m=+20.190930126" observedRunningTime="2026-04-22 16:22:08.034541069 +0000 UTC m=+27.736559178" watchObservedRunningTime="2026-04-22 16:22:08.036176104 +0000 UTC m=+27.738194211" Apr 22 16:22:08.881197 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.881164 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qf7zg"] Apr 22 16:22:08.881347 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.881305 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:08.881413 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:08.881390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:08.883712 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.883692 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5wqw7"] Apr 22 16:22:08.883814 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:08.883784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:08.883873 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:08.883858 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:09.873601 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:09.873400 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4c4gx" Apr 22 16:22:09.873943 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:09.873685 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 16:22:09.874189 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:09.874173 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4c4gx" Apr 22 16:22:10.018424 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:10.018392 2575 generic.go:358] "Generic (PLEG): container finished" podID="020a35c8-d40b-477c-8c6e-1530096b3f1a" containerID="97895be7bc6c3cac4266753bafe08cdde93e6ec846dbe875ea5db49e74b51e6f" exitCode=0 Apr 22 16:22:10.018655 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:10.018465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerDied","Data":"97895be7bc6c3cac4266753bafe08cdde93e6ec846dbe875ea5db49e74b51e6f"} Apr 22 16:22:10.902679 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:10.902654 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:10.903117 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:10.902692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:10.903117 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:10.902771 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:10.903117 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:10.902801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:12.024313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:12.024279 2575 generic.go:358] "Generic (PLEG): container finished" podID="020a35c8-d40b-477c-8c6e-1530096b3f1a" containerID="ab84c75fb473fae2147a6a0521ad39ce8d1a0360cd9667e93d4c26083f4a1496" exitCode=0 Apr 22 16:22:12.024763 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:12.024341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerDied","Data":"ab84c75fb473fae2147a6a0521ad39ce8d1a0360cd9667e93d4c26083f4a1496"} Apr 22 16:22:12.900401 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:12.900366 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:12.900579 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:12.900491 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qf7zg" podUID="8c01b2f6-2652-4b52-88bd-aa16905c79db" Apr 22 16:22:12.900579 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:12.900551 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:12.900700 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:12.900674 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:22:13.640715 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.640639 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-238.ec2.internal" event="NodeReady" Apr 22 16:22:13.641232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.640755 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 16:22:13.684101 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.684072 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8zkfn"] Apr 22 16:22:13.704222 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.704194 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r9mm9"] Apr 22 16:22:13.704520 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.704493 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.708060 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.708021 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 16:22:13.708060 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.708055 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fsrq7\"" Apr 22 16:22:13.708227 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.708021 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 16:22:13.718949 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.718929 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8zkfn"] Apr 22 16:22:13.719058 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.718954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-r9mm9"] Apr 22 16:22:13.719153 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.719077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:13.721719 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.721701 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 16:22:13.721822 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.721706 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 16:22:13.721981 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.721967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 16:22:13.722061 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.721997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pdvf2\"" Apr 22 16:22:13.808559 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.808527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.808706 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.808583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:13.808706 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 16:22:13.808611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-tmp-dir\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.808706 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.808659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g488x\" (UniqueName: \"kubernetes.io/projected/3ccac1e5-a013-4728-8544-cd8df005a479-kube-api-access-g488x\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:13.808850 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.808705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grtb\" (UniqueName: \"kubernetes.io/projected/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-kube-api-access-8grtb\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.808850 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.808745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-config-volume\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.909346 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8grtb\" (UniqueName: \"kubernetes.io/projected/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-kube-api-access-8grtb\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " 
pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.909346 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-config-volume\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.909535 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.909535 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:13.909535 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-tmp-dir\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.909535 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g488x\" (UniqueName: \"kubernetes.io/projected/3ccac1e5-a013-4728-8544-cd8df005a479-kube-api-access-g488x\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 
16:22:13.909713 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:13.909691 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:13.909773 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:13.909761 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:14.409741888 +0000 UTC m=+34.111759975 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:13.909926 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.909905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-tmp-dir\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.909992 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:13.909978 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:13.910075 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:13.910065 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:22:14.410026817 +0000 UTC m=+34.112044907 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:13.920108 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.920081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grtb\" (UniqueName: \"kubernetes.io/projected/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-kube-api-access-8grtb\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:13.920267 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.920249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g488x\" (UniqueName: \"kubernetes.io/projected/3ccac1e5-a013-4728-8544-cd8df005a479-kube-api-access-g488x\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:13.920551 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:13.920528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-config-volume\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:14.413814 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.413777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:14.413814 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.413824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:14.414117 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.413935 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:14.414117 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.414019 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:22:15.413999171 +0000 UTC m=+35.116017274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:14.414117 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.413940 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:14.414117 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.414096 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:15.414079737 +0000 UTC m=+35.116097831 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:14.514663 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.514627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:14.514859 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.514784 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:22:14.514924 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.514879 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:46.514859014 +0000 UTC m=+66.216877103 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:22:14.615806 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.615771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:14.616007 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.615985 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:22:14.616090 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.616009 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:22:14.616090 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.616024 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2vxcb for pod openshift-network-diagnostics/network-check-target-qf7zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:22:14.616170 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:14.616107 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb podName:8c01b2f6-2652-4b52-88bd-aa16905c79db nodeName:}" failed. 
No retries permitted until 2026-04-22 16:22:46.616087471 +0000 UTC m=+66.318105562 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vxcb" (UniqueName: "kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb") pod "network-check-target-qf7zg" (UID: "8c01b2f6-2652-4b52-88bd-aa16905c79db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:22:14.900206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.900171 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:14.900206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.900198 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:14.902995 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.902972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 16:22:14.903937 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.903917 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 16:22:14.904110 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.904097 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 16:22:14.904275 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.904256 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dv4l7\"" Apr 22 16:22:14.904412 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:14.904399 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-phtvm\"" Apr 22 16:22:15.422199 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:15.422162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:15.422449 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:15.422220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:15.422449 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:15.422341 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:15.422449 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:15.422345 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:15.422449 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:15.422393 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:17.422380584 +0000 UTC m=+37.124398672 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:15.422449 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:15.422424 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:22:17.422404269 +0000 UTC m=+37.124422360 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:17.440251 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:17.440076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:17.440591 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:17.440262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:17.440591 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:17.440230 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:17.440591 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:17.440339 2575 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:17.440591 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:17.440382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:22:21.440361863 +0000 UTC m=+41.142379950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:17.440591 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:17.440399 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:21.440391895 +0000 UTC m=+41.142409982 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:18.038649 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:18.038568 2575 generic.go:358] "Generic (PLEG): container finished" podID="020a35c8-d40b-477c-8c6e-1530096b3f1a" containerID="185fbf48154421f7b0359b67e46b27466b9e6131300d96e02a413030130026cf" exitCode=0 Apr 22 16:22:18.038649 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:18.038617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerDied","Data":"185fbf48154421f7b0359b67e46b27466b9e6131300d96e02a413030130026cf"} Apr 22 16:22:19.043411 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:19.043380 2575 generic.go:358] "Generic (PLEG): container finished" podID="020a35c8-d40b-477c-8c6e-1530096b3f1a" containerID="7028f05135460fee332fdd83c6783f51cc64ac760f496fb9c460f9357e18d0e2" exitCode=0 Apr 22 16:22:19.043757 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:19.043422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerDied","Data":"7028f05135460fee332fdd83c6783f51cc64ac760f496fb9c460f9357e18d0e2"} Apr 22 16:22:20.048129 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:20.048099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4gkbb" event={"ID":"020a35c8-d40b-477c-8c6e-1530096b3f1a","Type":"ContainerStarted","Data":"6e0d789388d2226c23792eaeb12f584bccfda3f8ed6e327294d03ab0f0016473"} Apr 22 16:22:20.070652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:20.070605 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-4gkbb" podStartSLOduration=6.053216126 podStartE2EDuration="40.07059086s" podCreationTimestamp="2026-04-22 16:21:40 +0000 UTC" firstStartedPulling="2026-04-22 16:21:43.58594077 +0000 UTC m=+3.287958871" lastFinishedPulling="2026-04-22 16:22:17.603315515 +0000 UTC m=+37.305333605" observedRunningTime="2026-04-22 16:22:20.069346711 +0000 UTC m=+39.771364819" watchObservedRunningTime="2026-04-22 16:22:20.07059086 +0000 UTC m=+39.772609004" Apr 22 16:22:21.471105 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:21.471032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:21.471527 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:21.471130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:21.471527 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:21.471173 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:21.471527 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:21.471241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:22:29.471222598 +0000 UTC m=+49.173240685 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:21.471527 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:21.471242 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:21.471527 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:21.471295 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:29.47127872 +0000 UTC m=+49.173296807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:29.523099 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:29.523032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:29.523548 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:29.523114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:29.523548 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:29.523156 2575 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:29.523548 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:29.523200 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:29.523548 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:29.523223 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:22:45.52320722 +0000 UTC m=+65.225225307 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:29.523548 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:29.523241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:45.52322957 +0000 UTC m=+65.225247657 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:40.035187 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:40.035155 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxgm2" Apr 22 16:22:45.526972 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:45.526917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:22:45.527466 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:45.526993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:22:45.527466 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:45.527071 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:45.527466 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:45.527128 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:45.527466 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:45.527143 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. 
No retries permitted until 2026-04-22 16:23:17.527126733 +0000 UTC m=+97.229144825 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:22:45.527466 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:45.527187 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:17.527174147 +0000 UTC m=+97.229192234 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:22:46.532329 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.532295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:22:46.535209 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.535194 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 16:22:46.543026 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:46.543009 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 16:22:46.543094 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:22:46.543085 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:50.543069213 +0000 UTC m=+130.245087300 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : secret "metrics-daemon-secret" not found Apr 22 16:22:46.632628 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.632602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:46.634977 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.634960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 16:22:46.644806 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.644788 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 16:22:46.656692 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.656672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vxcb\" (UniqueName: \"kubernetes.io/projected/8c01b2f6-2652-4b52-88bd-aa16905c79db-kube-api-access-2vxcb\") pod \"network-check-target-qf7zg\" (UID: \"8c01b2f6-2652-4b52-88bd-aa16905c79db\") " pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:46.720553 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.720535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-phtvm\"" Apr 22 16:22:46.728689 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.728675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:22:46.848541 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:46.848511 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qf7zg"] Apr 22 16:22:46.851869 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:22:46.851846 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c01b2f6_2652_4b52_88bd_aa16905c79db.slice/crio-f75488ea0a4c05f1251e7bd3cc48141d5aaa3bc18b251213dc619320e0054ada WatchSource:0}: Error finding container f75488ea0a4c05f1251e7bd3cc48141d5aaa3bc18b251213dc619320e0054ada: Status 404 returned error can't find the container with id f75488ea0a4c05f1251e7bd3cc48141d5aaa3bc18b251213dc619320e0054ada Apr 22 16:22:47.096850 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:47.096769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qf7zg" event={"ID":"8c01b2f6-2652-4b52-88bd-aa16905c79db","Type":"ContainerStarted","Data":"f75488ea0a4c05f1251e7bd3cc48141d5aaa3bc18b251213dc619320e0054ada"} Apr 22 16:22:50.103676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:50.103641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qf7zg" event={"ID":"8c01b2f6-2652-4b52-88bd-aa16905c79db","Type":"ContainerStarted","Data":"de9408c7a4077fad7ce8d8ea90a06d7ba6d6827cb8e39e78c7c9097bcba8a0d6"} Apr 22 16:22:50.104129 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:50.103750 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 
16:22:50.119455 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:22:50.119394 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qf7zg" podStartSLOduration=66.542355185 podStartE2EDuration="1m9.119375946s" podCreationTimestamp="2026-04-22 16:21:41 +0000 UTC" firstStartedPulling="2026-04-22 16:22:46.854146666 +0000 UTC m=+66.556164763" lastFinishedPulling="2026-04-22 16:22:49.431167436 +0000 UTC m=+69.133185524" observedRunningTime="2026-04-22 16:22:50.118712584 +0000 UTC m=+69.820730693" watchObservedRunningTime="2026-04-22 16:22:50.119375946 +0000 UTC m=+69.821394066" Apr 22 16:23:17.626855 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:17.626804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:23:17.627364 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:17.626882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:23:17.627364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:17.626950 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:23:17.627364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:17.627011 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert podName:3ccac1e5-a013-4728-8544-cd8df005a479 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:24:21.626993391 +0000 UTC m=+161.329011479 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert") pod "ingress-canary-r9mm9" (UID: "3ccac1e5-a013-4728-8544-cd8df005a479") : secret "canary-serving-cert" not found Apr 22 16:23:17.627364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:17.626956 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:23:17.627364 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:17.627099 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls podName:e3238d38-c8d6-423c-bfa5-3feb9c21e8bc nodeName:}" failed. No retries permitted until 2026-04-22 16:24:21.627087501 +0000 UTC m=+161.329105589 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls") pod "dns-default-8zkfn" (UID: "e3238d38-c8d6-423c-bfa5-3feb9c21e8bc") : secret "dns-default-metrics-tls" not found Apr 22 16:23:21.108459 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:21.108430 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qf7zg" Apr 22 16:23:40.314818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.314787 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"] Apr 22 16:23:40.317840 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.317661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"
Apr 22 16:23:40.318197 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.318062 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6db85cc586-wptmg"]
Apr 22 16:23:40.320176 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.320152 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:23:40.320257 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.320196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-4dm58\""
Apr 22 16:23:40.320435 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.320423 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 16:23:40.320782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.320769 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hrwl2"]
Apr 22 16:23:40.320907 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.320894 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.322963 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.322945 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 16:23:40.323424 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 16:23:40.323521 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 16:23:40.323521 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323495 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 16:23:40.323814 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323795 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-j82f8\""
Apr 22 16:23:40.323907 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 16:23:40.323907 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 16:23:40.323907 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.323875 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.326329 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.326309 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 16:23:40.326633 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.326615 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 16:23:40.326713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.326613 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 16:23:40.326713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.326660 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-t87h9\""
Apr 22 16:23:40.327350 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.327224 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 16:23:40.327350 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.327323 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"]
Apr 22 16:23:40.332761 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.332738 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hrwl2"]
Apr 22 16:23:40.334204 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.334184 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6db85cc586-wptmg"]
Apr 22 16:23:40.334505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.334487 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 16:23:40.373450 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwrs\" (UniqueName: \"kubernetes.io/projected/7f802959-aff2-42a4-8382-362ea3582d6a-kube-api-access-rnwrs\") pod \"volume-data-source-validator-7c6cbb6c87-fqmt7\" (UID: \"7f802959-aff2-42a4-8382-362ea3582d6a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"
Apr 22 16:23:40.373547 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47b372d5-7432-499c-b5f2-baff8c5f3689-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.373547 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrl2\" (UniqueName: \"kubernetes.io/projected/de8899ee-ffb9-447d-bfe6-3f4560af10d3-kube-api-access-tgrl2\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.373625 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/47b372d5-7432-499c-b5f2-baff8c5f3689-snapshots\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.373625 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47b372d5-7432-499c-b5f2-baff8c5f3689-tmp\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.373625 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.373710 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-stats-auth\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.373710 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlprh\" (UniqueName: \"kubernetes.io/projected/47b372d5-7432-499c-b5f2-baff8c5f3689-kube-api-access-wlprh\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.373772 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.373772 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373731 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47b372d5-7432-499c-b5f2-baff8c5f3689-service-ca-bundle\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.373772 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-default-certificate\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.373872 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.373779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b372d5-7432-499c-b5f2-baff8c5f3689-serving-cert\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.475087 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47b372d5-7432-499c-b5f2-baff8c5f3689-tmp\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.475087 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-stats-auth\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475146 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlprh\" (UniqueName: \"kubernetes.io/projected/47b372d5-7432-499c-b5f2-baff8c5f3689-kube-api-access-wlprh\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47b372d5-7432-499c-b5f2-baff8c5f3689-service-ca-bundle\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:40.475224 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:40.975202136 +0000 UTC m=+120.677220223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : configmap references non-existent config key: service-ca.crt
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-default-certificate\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.475290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b372d5-7432-499c-b5f2-baff8c5f3689-serving-cert\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.475618 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:40.475315 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 16:23:40.475618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwrs\" (UniqueName: \"kubernetes.io/projected/7f802959-aff2-42a4-8382-362ea3582d6a-kube-api-access-rnwrs\") pod \"volume-data-source-validator-7c6cbb6c87-fqmt7\" (UID: \"7f802959-aff2-42a4-8382-362ea3582d6a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"
Apr 22 16:23:40.475618 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:40.475392 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:40.975374545 +0000 UTC m=+120.677392632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : secret "router-metrics-certs-default" not found
Apr 22 16:23:40.475618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47b372d5-7432-499c-b5f2-baff8c5f3689-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.475618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrl2\" (UniqueName: \"kubernetes.io/projected/de8899ee-ffb9-447d-bfe6-3f4560af10d3-kube-api-access-tgrl2\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.475618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.475499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/47b372d5-7432-499c-b5f2-baff8c5f3689-snapshots\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.476314 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.476288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47b372d5-7432-499c-b5f2-baff8c5f3689-tmp\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.476488 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.476466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/47b372d5-7432-499c-b5f2-baff8c5f3689-snapshots\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.476569 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.476493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47b372d5-7432-499c-b5f2-baff8c5f3689-service-ca-bundle\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.476790 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.476767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47b372d5-7432-499c-b5f2-baff8c5f3689-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.477833 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.477805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b372d5-7432-499c-b5f2-baff8c5f3689-serving-cert\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.477927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.477879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-stats-auth\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.477927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.477896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-default-certificate\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.483967 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.483910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlprh\" (UniqueName: \"kubernetes.io/projected/47b372d5-7432-499c-b5f2-baff8c5f3689-kube-api-access-wlprh\") pod \"insights-operator-585dfdc468-hrwl2\" (UID: \"47b372d5-7432-499c-b5f2-baff8c5f3689\") " pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.484256 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.484235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrl2\" (UniqueName: \"kubernetes.io/projected/de8899ee-ffb9-447d-bfe6-3f4560af10d3-kube-api-access-tgrl2\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.485010 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.484992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwrs\" (UniqueName: \"kubernetes.io/projected/7f802959-aff2-42a4-8382-362ea3582d6a-kube-api-access-rnwrs\") pod \"volume-data-source-validator-7c6cbb6c87-fqmt7\" (UID: \"7f802959-aff2-42a4-8382-362ea3582d6a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"
Apr 22 16:23:40.628223 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.628140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"
Apr 22 16:23:40.645151 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.645115 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hrwl2"
Apr 22 16:23:40.767921 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.767891 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7"]
Apr 22 16:23:40.772015 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:23:40.771988 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f802959_aff2_42a4_8382_362ea3582d6a.slice/crio-504d74fce550a13e8b03f13f6c166607e8d2b3b2991679c5ebf60f98e3c17740 WatchSource:0}: Error finding container 504d74fce550a13e8b03f13f6c166607e8d2b3b2991679c5ebf60f98e3c17740: Status 404 returned error can't find the container with id 504d74fce550a13e8b03f13f6c166607e8d2b3b2991679c5ebf60f98e3c17740
Apr 22 16:23:40.783796 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.783772 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hrwl2"]
Apr 22 16:23:40.786624 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:23:40.786603 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b372d5_7432_499c_b5f2_baff8c5f3689.slice/crio-1fbb43a56b97dd5fb792972036c6f5926e81d9b2279b496ddede9be855496f7b WatchSource:0}: Error finding container 1fbb43a56b97dd5fb792972036c6f5926e81d9b2279b496ddede9be855496f7b: Status 404 returned error can't find the container with id 1fbb43a56b97dd5fb792972036c6f5926e81d9b2279b496ddede9be855496f7b
Apr 22 16:23:40.980416 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.980382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.980573 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:40.980439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:40.980573 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:40.980550 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 16:23:40.980573 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:40.980555 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:41.980536218 +0000 UTC m=+121.682554306 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : configmap references non-existent config key: service-ca.crt
Apr 22 16:23:40.980679 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:40.980593 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:41.98057655 +0000 UTC m=+121.682594640 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : secret "router-metrics-certs-default" not found
Apr 22 16:23:41.192926 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:41.192887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrwl2" event={"ID":"47b372d5-7432-499c-b5f2-baff8c5f3689","Type":"ContainerStarted","Data":"1fbb43a56b97dd5fb792972036c6f5926e81d9b2279b496ddede9be855496f7b"}
Apr 22 16:23:41.193818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:41.193797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7" event={"ID":"7f802959-aff2-42a4-8382-362ea3582d6a","Type":"ContainerStarted","Data":"504d74fce550a13e8b03f13f6c166607e8d2b3b2991679c5ebf60f98e3c17740"}
Apr 22 16:23:41.990137 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:41.990101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:41.990490 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:41.990200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:41.990490 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:41.990244 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 16:23:41.990490 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:41.990325 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:43.990306405 +0000 UTC m=+123.692324493 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : secret "router-metrics-certs-default" not found
Apr 22 16:23:41.990490 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:41.990339 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:43.99033356 +0000 UTC m=+123.692351647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : configmap references non-existent config key: service-ca.crt
Apr 22 16:23:44.005325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.005289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:44.005707 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.005363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:44.005707 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.005444 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 16:23:44.005707 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.005477 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:48.005464021 +0000 UTC m=+127.707482108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : configmap references non-existent config key: service-ca.crt
Apr 22 16:23:44.005707 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.005500 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:48.005485692 +0000 UTC m=+127.707503779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : secret "router-metrics-certs-default" not found
Apr 22 16:23:44.200607 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.200572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrwl2" event={"ID":"47b372d5-7432-499c-b5f2-baff8c5f3689","Type":"ContainerStarted","Data":"222cca632b4e750650e6647a784e2a536f7ccb60b4d0a72b5133063791147338"}
Apr 22 16:23:44.201825 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.201803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7" event={"ID":"7f802959-aff2-42a4-8382-362ea3582d6a","Type":"ContainerStarted","Data":"017d0a7199c2527eea8e263c0e57e894102b8d6b73c25f4cb90fa3ce7e21ca5a"}
Apr 22 16:23:44.217176 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.217128 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-hrwl2" podStartSLOduration=1.8503831370000001 podStartE2EDuration="4.217112728s" podCreationTimestamp="2026-04-22 16:23:40 +0000 UTC" firstStartedPulling="2026-04-22 16:23:40.788223728 +0000 UTC m=+120.490241815" lastFinishedPulling="2026-04-22 16:23:43.154953319 +0000 UTC m=+122.856971406" observedRunningTime="2026-04-22 16:23:44.216184011 +0000 UTC m=+123.918202120" watchObservedRunningTime="2026-04-22 16:23:44.217112728 +0000 UTC m=+123.919130839"
Apr 22 16:23:44.230240 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.230192 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-fqmt7" podStartSLOduration=1.8529824289999999 podStartE2EDuration="4.230176311s" podCreationTimestamp="2026-04-22 16:23:40 +0000 UTC" firstStartedPulling="2026-04-22 16:23:40.773483969 +0000 UTC m=+120.475502057" lastFinishedPulling="2026-04-22 16:23:43.150677848 +0000 UTC m=+122.852695939" observedRunningTime="2026-04-22 16:23:44.228557826 +0000 UTC m=+123.930575937" watchObservedRunningTime="2026-04-22 16:23:44.230176311 +0000 UTC m=+123.932194414"
Apr 22 16:23:44.255219 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.255188 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"]
Apr 22 16:23:44.258523 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.258483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.260852 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.260828 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 16:23:44.260965 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.260883 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 16:23:44.260965 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.260899 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:23:44.261089 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.260984 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wpp8p\""
Apr 22 16:23:44.267192 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.267170 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"]
Apr 22 16:23:44.307561 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.307532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.307679 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.307571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22d48\" (UniqueName: \"kubernetes.io/projected/2fce02cf-d44f-4c69-80d1-524c6b7fb205-kube-api-access-22d48\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.408676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.408650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.408676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.408679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22d48\" (UniqueName: \"kubernetes.io/projected/2fce02cf-d44f-4c69-80d1-524c6b7fb205-kube-api-access-22d48\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.408799 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.408785 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 16:23:44.408850 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.408843 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls podName:2fce02cf-d44f-4c69-80d1-524c6b7fb205 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:44.908828307 +0000 UTC m=+124.610846394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2m6w7" (UID: "2fce02cf-d44f-4c69-80d1-524c6b7fb205") : secret "samples-operator-tls" not found
Apr 22 16:23:44.419517 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.419492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22d48\" (UniqueName: \"kubernetes.io/projected/2fce02cf-d44f-4c69-80d1-524c6b7fb205-kube-api-access-22d48\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.911691 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:44.911646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:44.911845 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.911776 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 16:23:44.911845 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:44.911832 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls podName:2fce02cf-d44f-4c69-80d1-524c6b7fb205 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:45.911816603 +0000 UTC m=+125.613834690 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2m6w7" (UID: "2fce02cf-d44f-4c69-80d1-524c6b7fb205") : secret "samples-operator-tls" not found Apr 22 16:23:45.918998 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:45.918960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7" Apr 22 16:23:45.919493 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:45.919115 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 16:23:45.919493 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:45.919179 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls podName:2fce02cf-d44f-4c69-80d1-524c6b7fb205 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:47.919163405 +0000 UTC m=+127.621181493 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2m6w7" (UID: "2fce02cf-d44f-4c69-80d1-524c6b7fb205") : secret "samples-operator-tls" not found Apr 22 16:23:46.138233 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:46.138204 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jg6wx_91576000-c254-43f5-84ba-7029c347da22/dns-node-resolver/0.log" Apr 22 16:23:46.738337 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:46.738306 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xkxjl_a929b972-02c0-4e8a-b302-09406b1c441c/node-ca/0.log" Apr 22 16:23:47.932661 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:47.932607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7" Apr 22 16:23:47.933090 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:47.932762 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 16:23:47.933090 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:47.932829 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls podName:2fce02cf-d44f-4c69-80d1-524c6b7fb205 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:51.932813804 +0000 UTC m=+131.634831895 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2m6w7" (UID: "2fce02cf-d44f-4c69-80d1-524c6b7fb205") : secret "samples-operator-tls" not found Apr 22 16:23:48.033760 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:48.033715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg" Apr 22 16:23:48.033928 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:48.033819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg" Apr 22 16:23:48.033928 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:48.033854 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 16:23:48.033928 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:48.033920 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:56.033903548 +0000 UTC m=+135.735921649 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : secret "router-metrics-certs-default" not found Apr 22 16:23:48.034085 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:48.033955 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:56.033939657 +0000 UTC m=+135.735957744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : configmap references non-existent config key: service-ca.crt Apr 22 16:23:49.259497 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.259463 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5"] Apr 22 16:23:49.262481 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.262465 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.264693 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.264664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 16:23:49.264832 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.264807 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 16:23:49.264952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.264807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-nfdjb\"" Apr 22 16:23:49.265634 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.265618 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 16:23:49.265714 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.265659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:23:49.268530 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.268509 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5"] Apr 22 16:23:49.343923 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.343895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81523a0-77f7-4e1d-9f17-d99fd060b090-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.344100 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.343957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hldn\" (UniqueName: \"kubernetes.io/projected/b81523a0-77f7-4e1d-9f17-d99fd060b090-kube-api-access-4hldn\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.344100 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.344028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81523a0-77f7-4e1d-9f17-d99fd060b090-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.444812 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.444777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hldn\" (UniqueName: \"kubernetes.io/projected/b81523a0-77f7-4e1d-9f17-d99fd060b090-kube-api-access-4hldn\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.444972 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.444839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81523a0-77f7-4e1d-9f17-d99fd060b090-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: 
\"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.444972 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.444876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81523a0-77f7-4e1d-9f17-d99fd060b090-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.445409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.445384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81523a0-77f7-4e1d-9f17-d99fd060b090-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.447099 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.447080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81523a0-77f7-4e1d-9f17-d99fd060b090-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.452562 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.452537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hldn\" (UniqueName: \"kubernetes.io/projected/b81523a0-77f7-4e1d-9f17-d99fd060b090-kube-api-access-4hldn\") pod \"kube-storage-version-migrator-operator-6769c5d45-jjjg5\" (UID: \"b81523a0-77f7-4e1d-9f17-d99fd060b090\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.571379 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.571297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" Apr 22 16:23:49.679738 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:49.679706 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5"] Apr 22 16:23:49.683336 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:23:49.683302 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81523a0_77f7_4e1d_9f17_d99fd060b090.slice/crio-052a084569c916b9e002228c2dae650312ea1a4be149823f5e1a931a53cdf16b WatchSource:0}: Error finding container 052a084569c916b9e002228c2dae650312ea1a4be149823f5e1a931a53cdf16b: Status 404 returned error can't find the container with id 052a084569c916b9e002228c2dae650312ea1a4be149823f5e1a931a53cdf16b Apr 22 16:23:50.081319 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.081290 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg"] Apr 22 16:23:50.085570 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.085555 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" Apr 22 16:23:50.087791 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.087772 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dm8k8\"" Apr 22 16:23:50.090332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.090312 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg"] Apr 22 16:23:50.150816 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.150791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4cf2\" (UniqueName: \"kubernetes.io/projected/2771db14-dc3f-4cef-bc3c-e4a4eebb6225-kube-api-access-q4cf2\") pod \"network-check-source-8894fc9bd-lcxgg\" (UID: \"2771db14-dc3f-4cef-bc3c-e4a4eebb6225\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" Apr 22 16:23:50.212825 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.212791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" event={"ID":"b81523a0-77f7-4e1d-9f17-d99fd060b090","Type":"ContainerStarted","Data":"052a084569c916b9e002228c2dae650312ea1a4be149823f5e1a931a53cdf16b"} Apr 22 16:23:50.251409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.251382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4cf2\" (UniqueName: \"kubernetes.io/projected/2771db14-dc3f-4cef-bc3c-e4a4eebb6225-kube-api-access-q4cf2\") pod \"network-check-source-8894fc9bd-lcxgg\" (UID: \"2771db14-dc3f-4cef-bc3c-e4a4eebb6225\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" Apr 22 16:23:50.261839 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.261817 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q4cf2\" (UniqueName: \"kubernetes.io/projected/2771db14-dc3f-4cef-bc3c-e4a4eebb6225-kube-api-access-q4cf2\") pod \"network-check-source-8894fc9bd-lcxgg\" (UID: \"2771db14-dc3f-4cef-bc3c-e4a4eebb6225\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" Apr 22 16:23:50.264176 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.264156 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr"] Apr 22 16:23:50.266848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.266832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.269418 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.269397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 16:23:50.269686 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.269620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-c6rt6\"" Apr 22 16:23:50.269686 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.269650 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 16:23:50.269850 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.269712 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 16:23:50.269850 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.269734 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:23:50.276109 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.276080 2575 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr"] Apr 22 16:23:50.351777 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.351718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/656f9afe-2a94-4ce8-8322-98dccbf691a1-config\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.351777 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.351747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8mf\" (UniqueName: \"kubernetes.io/projected/656f9afe-2a94-4ce8-8322-98dccbf691a1-kube-api-access-fz8mf\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.351941 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.351838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/656f9afe-2a94-4ce8-8322-98dccbf691a1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.394744 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.394714 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" Apr 22 16:23:50.454385 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.452921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/656f9afe-2a94-4ce8-8322-98dccbf691a1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.454385 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.452997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/656f9afe-2a94-4ce8-8322-98dccbf691a1-config\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.454385 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.453025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8mf\" (UniqueName: \"kubernetes.io/projected/656f9afe-2a94-4ce8-8322-98dccbf691a1-kube-api-access-fz8mf\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.454385 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.454337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/656f9afe-2a94-4ce8-8322-98dccbf691a1-config\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.459313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.459257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/656f9afe-2a94-4ce8-8322-98dccbf691a1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.461501 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.461476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8mf\" (UniqueName: \"kubernetes.io/projected/656f9afe-2a94-4ce8-8322-98dccbf691a1-kube-api-access-fz8mf\") pod \"service-ca-operator-d6fc45fc5-4n4pr\" (UID: \"656f9afe-2a94-4ce8-8322-98dccbf691a1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.522193 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.522166 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg"] Apr 22 16:23:50.525307 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:23:50.525272 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2771db14_dc3f_4cef_bc3c_e4a4eebb6225.slice/crio-e600da2454babd48592d84c11891dd0f96de4b1577fa33c905c1196a8f37f8c6 WatchSource:0}: Error finding container e600da2454babd48592d84c11891dd0f96de4b1577fa33c905c1196a8f37f8c6: Status 404 returned error can't find the container with id e600da2454babd48592d84c11891dd0f96de4b1577fa33c905c1196a8f37f8c6 Apr 22 16:23:50.554016 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.553996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:23:50.554122 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:50.554108 2575 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 16:23:50.554170 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:50.554161 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs podName:09f37d35-30d1-4fc0-a88f-3514e6c16586 nodeName:}" failed. No retries permitted until 2026-04-22 16:25:52.554146428 +0000 UTC m=+252.256164515 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs") pod "network-metrics-daemon-5wqw7" (UID: "09f37d35-30d1-4fc0-a88f-3514e6c16586") : secret "metrics-daemon-secret" not found Apr 22 16:23:50.574675 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.574623 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" Apr 22 16:23:50.690705 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:50.690671 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr"] Apr 22 16:23:50.694642 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:23:50.694611 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod656f9afe_2a94_4ce8_8322_98dccbf691a1.slice/crio-316b649649101b91f51baa1eaaefc002f7aa0fd00873e9aeb92bb7b006830f6e WatchSource:0}: Error finding container 316b649649101b91f51baa1eaaefc002f7aa0fd00873e9aeb92bb7b006830f6e: Status 404 returned error can't find the container with id 316b649649101b91f51baa1eaaefc002f7aa0fd00873e9aeb92bb7b006830f6e Apr 22 16:23:51.215504 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:51.215461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" 
event={"ID":"656f9afe-2a94-4ce8-8322-98dccbf691a1","Type":"ContainerStarted","Data":"316b649649101b91f51baa1eaaefc002f7aa0fd00873e9aeb92bb7b006830f6e"}
Apr 22 16:23:51.216667 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:51.216648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" event={"ID":"2771db14-dc3f-4cef-bc3c-e4a4eebb6225","Type":"ContainerStarted","Data":"24a8f8767b214ba5d391c67731ffbd37b36db2da97a7e65524d0be4311526ba7"}
Apr 22 16:23:51.216747 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:51.216670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" event={"ID":"2771db14-dc3f-4cef-bc3c-e4a4eebb6225","Type":"ContainerStarted","Data":"e600da2454babd48592d84c11891dd0f96de4b1577fa33c905c1196a8f37f8c6"}
Apr 22 16:23:51.231727 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:51.231686 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lcxgg" podStartSLOduration=1.231674106 podStartE2EDuration="1.231674106s" podCreationTimestamp="2026-04-22 16:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:23:51.2307531 +0000 UTC m=+130.932771208" watchObservedRunningTime="2026-04-22 16:23:51.231674106 +0000 UTC m=+130.933692214"
Apr 22 16:23:51.968440 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:51.968394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:23:51.968863 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:51.968644 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 16:23:51.968863 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:51.968719 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls podName:2fce02cf-d44f-4c69-80d1-524c6b7fb205 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:59.968701438 +0000 UTC m=+139.670719535 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2m6w7" (UID: "2fce02cf-d44f-4c69-80d1-524c6b7fb205") : secret "samples-operator-tls" not found
Apr 22 16:23:53.222888 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:53.222849 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" event={"ID":"b81523a0-77f7-4e1d-9f17-d99fd060b090","Type":"ContainerStarted","Data":"02d2d52b7362666fe89ec5c3bf2f79f572916fd3be4a0c4603ba3f1ac8f17e3d"}
Apr 22 16:23:53.239684 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:53.239624 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" podStartSLOduration=1.50308967 podStartE2EDuration="4.239605157s" podCreationTimestamp="2026-04-22 16:23:49 +0000 UTC" firstStartedPulling="2026-04-22 16:23:49.68505933 +0000 UTC m=+129.387077417" lastFinishedPulling="2026-04-22 16:23:52.421574804 +0000 UTC m=+132.123592904" observedRunningTime="2026-04-22 16:23:53.237534852 +0000 UTC m=+132.939552961" watchObservedRunningTime="2026-04-22 16:23:53.239605157 +0000 UTC m=+132.941623267"
Apr 22 16:23:54.226830 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:54.226797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" event={"ID":"656f9afe-2a94-4ce8-8322-98dccbf691a1","Type":"ContainerStarted","Data":"b5c12b3ad9df14b61020edef0b1953757fbff477b94b782ce990d9a29c5e57ee"}
Apr 22 16:23:54.242783 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:54.242735 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" podStartSLOduration=1.431061553 podStartE2EDuration="4.242718353s" podCreationTimestamp="2026-04-22 16:23:50 +0000 UTC" firstStartedPulling="2026-04-22 16:23:50.696801737 +0000 UTC m=+130.398819826" lastFinishedPulling="2026-04-22 16:23:53.508458538 +0000 UTC m=+133.210476626" observedRunningTime="2026-04-22 16:23:54.242371247 +0000 UTC m=+133.944389356" watchObservedRunningTime="2026-04-22 16:23:54.242718353 +0000 UTC m=+133.944736462"
Apr 22 16:23:56.103978 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:56.103938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:56.104354 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:23:56.104005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:23:56.104354 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:56.104124 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 16:23:56.104354 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:56.104127 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:12.104107279 +0000 UTC m=+151.806125367 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : configmap references non-existent config key: service-ca.crt
Apr 22 16:23:56.104354 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:23:56.104180 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs podName:de8899ee-ffb9-447d-bfe6-3f4560af10d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:12.104169948 +0000 UTC m=+151.806188035 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs") pod "router-default-6db85cc586-wptmg" (UID: "de8899ee-ffb9-447d-bfe6-3f4560af10d3") : secret "router-metrics-certs-default" not found
Apr 22 16:24:00.038214 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:00.038164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:24:00.038580 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:00.038312 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 16:24:00.038580 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:00.038383 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls podName:2fce02cf-d44f-4c69-80d1-524c6b7fb205 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:16.038362883 +0000 UTC m=+155.740380971 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2m6w7" (UID: "2fce02cf-d44f-4c69-80d1-524c6b7fb205") : secret "samples-operator-tls" not found
Apr 22 16:24:12.128284 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.128241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:12.128735 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.128315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:12.128810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.128789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8899ee-ffb9-447d-bfe6-3f4560af10d3-service-ca-bundle\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:12.130746 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.130722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8899ee-ffb9-447d-bfe6-3f4560af10d3-metrics-certs\") pod \"router-default-6db85cc586-wptmg\" (UID: \"de8899ee-ffb9-447d-bfe6-3f4560af10d3\") " pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:12.142677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.142657 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-j82f8\""
Apr 22 16:24:12.150643 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.150629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:12.271867 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:12.271839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6db85cc586-wptmg"]
Apr 22 16:24:12.275291 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:12.275264 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8899ee_ffb9_447d_bfe6_3f4560af10d3.slice/crio-63713974df02513630ac9f178fa1d32a03f673a087ef4220f4b8a8673ff30bf6 WatchSource:0}: Error finding container 63713974df02513630ac9f178fa1d32a03f673a087ef4220f4b8a8673ff30bf6: Status 404 returned error can't find the container with id 63713974df02513630ac9f178fa1d32a03f673a087ef4220f4b8a8673ff30bf6
Apr 22 16:24:13.275094 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:13.275059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6db85cc586-wptmg" event={"ID":"de8899ee-ffb9-447d-bfe6-3f4560af10d3","Type":"ContainerStarted","Data":"686d0e5e56011c7428723658bc5272d0526ee258fbed06d9029421b0d42eea3e"}
Apr 22 16:24:13.275094 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:13.275097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6db85cc586-wptmg" event={"ID":"de8899ee-ffb9-447d-bfe6-3f4560af10d3","Type":"ContainerStarted","Data":"63713974df02513630ac9f178fa1d32a03f673a087ef4220f4b8a8673ff30bf6"}
Apr 22 16:24:13.297961 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:13.297913 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6db85cc586-wptmg" podStartSLOduration=33.297900453 podStartE2EDuration="33.297900453s" podCreationTimestamp="2026-04-22 16:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:24:13.296546364 +0000 UTC m=+152.998564472" watchObservedRunningTime="2026-04-22 16:24:13.297900453 +0000 UTC m=+152.999918593"
Apr 22 16:24:14.150848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:14.150809 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:14.153284 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:14.153261 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:14.277098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:14.277076 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:14.278145 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:14.278123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6db85cc586-wptmg"
Apr 22 16:24:15.990778 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:15.990752 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-659c89dd9c-7zqf4"]
Apr 22 16:24:15.993853 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:15.993832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:15.996540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:15.996511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 16:24:15.996646 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:15.996551 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 16:24:15.997530 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:15.997509 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r8rzs\""
Apr 22 16:24:15.997638 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:15.997558 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 16:24:16.002395 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.002376 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 16:24:16.011143 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.011111 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-659c89dd9c-7zqf4"]
Apr 22 16:24:16.059117 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-registry-tls\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059226 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ada8f77b-6c93-4914-a78d-b753c44deb3e-installation-pull-secrets\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059226 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-bound-sa-token\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059309 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ada8f77b-6c93-4914-a78d-b753c44deb3e-image-registry-private-configuration\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059343 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ada8f77b-6c93-4914-a78d-b753c44deb3e-trusted-ca\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059374 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ada8f77b-6c93-4914-a78d-b753c44deb3e-ca-trust-extracted\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059421 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fktvs\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-kube-api-access-fktvs\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059421 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ada8f77b-6c93-4914-a78d-b753c44deb3e-registry-certificates\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.059492 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:24:16.059847 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.059828 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"]
Apr 22 16:24:16.062130 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.062112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fce02cf-d44f-4c69-80d1-524c6b7fb205-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2m6w7\" (UID: \"2fce02cf-d44f-4c69-80d1-524c6b7fb205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:24:16.062545 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.062534 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"
Apr 22 16:24:16.064927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.064911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 22 16:24:16.065669 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.065648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ljgq9\""
Apr 22 16:24:16.069311 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.069296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"
Apr 22 16:24:16.076232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.075727 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"]
Apr 22 16:24:16.076793 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.076774 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-grrr6"]
Apr 22 16:24:16.080396 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.080375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.082632 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.082614 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-llksk\""
Apr 22 16:24:16.083492 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.083475 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 16:24:16.084298 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.084282 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 16:24:16.092209 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.092072 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-grrr6"]
Apr 22 16:24:16.159764 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ada8f77b-6c93-4914-a78d-b753c44deb3e-trusted-ca\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.159764 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ada8f77b-6c93-4914-a78d-b753c44deb3e-ca-trust-extracted\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.159968 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fktvs\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-kube-api-access-fktvs\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.159968 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af39f682-9ae9-4abb-ad48-ca3370263b6b-crio-socket\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.159968 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af39f682-9ae9-4abb-ad48-ca3370263b6b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.159968 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxf9\" (UniqueName: \"kubernetes.io/projected/af39f682-9ae9-4abb-ad48-ca3370263b6b-kube-api-access-qbxf9\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.159968 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ada8f77b-6c93-4914-a78d-b753c44deb3e-registry-certificates\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.159968 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.159961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-registry-tls\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ada8f77b-6c93-4914-a78d-b753c44deb3e-installation-pull-secrets\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z6p99\" (UID: \"dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af39f682-9ae9-4abb-ad48-ca3370263b6b-data-volume\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-bound-sa-token\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af39f682-9ae9-4abb-ad48-ca3370263b6b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ada8f77b-6c93-4914-a78d-b753c44deb3e-ca-trust-extracted\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.160282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.160216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ada8f77b-6c93-4914-a78d-b753c44deb3e-image-registry-private-configuration\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.161602 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.161575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ada8f77b-6c93-4914-a78d-b753c44deb3e-registry-certificates\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.168615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.163948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ada8f77b-6c93-4914-a78d-b753c44deb3e-image-registry-private-configuration\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.168615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.164217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ada8f77b-6c93-4914-a78d-b753c44deb3e-trusted-ca\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.168615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.164363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-registry-tls\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.168615 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.164363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ada8f77b-6c93-4914-a78d-b753c44deb3e-installation-pull-secrets\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.173721 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.173699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fktvs\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-kube-api-access-fktvs\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.174308 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.174288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ada8f77b-6c93-4914-a78d-b753c44deb3e-bound-sa-token\") pod \"image-registry-659c89dd9c-7zqf4\" (UID: \"ada8f77b-6c93-4914-a78d-b753c44deb3e\") " pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.198097 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.198078 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7"]
Apr 22 16:24:16.260528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af39f682-9ae9-4abb-ad48-ca3370263b6b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.260614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af39f682-9ae9-4abb-ad48-ca3370263b6b-crio-socket\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.260614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af39f682-9ae9-4abb-ad48-ca3370263b6b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.260614 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxf9\" (UniqueName: \"kubernetes.io/projected/af39f682-9ae9-4abb-ad48-ca3370263b6b-kube-api-access-qbxf9\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.260720 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af39f682-9ae9-4abb-ad48-ca3370263b6b-crio-socket\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.260720 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z6p99\" (UID: \"dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"
Apr 22 16:24:16.260720 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af39f682-9ae9-4abb-ad48-ca3370263b6b-data-volume\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.260954 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.260939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af39f682-9ae9-4abb-ad48-ca3370263b6b-data-volume\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.261054 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.261026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af39f682-9ae9-4abb-ad48-ca3370263b6b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.262651 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.262627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af39f682-9ae9-4abb-ad48-ca3370263b6b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.262956 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.262940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-z6p99\" (UID: \"dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"
Apr 22 16:24:16.281486 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.281456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7" event={"ID":"2fce02cf-d44f-4c69-80d1-524c6b7fb205","Type":"ContainerStarted","Data":"4a02d4241fc289e45d1dfdf83a7f2f9794f2c0f4ca72aebb7227b39efa49134f"}
Apr 22 16:24:16.283392 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.283364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxf9\" (UniqueName: \"kubernetes.io/projected/af39f682-9ae9-4abb-ad48-ca3370263b6b-kube-api-access-qbxf9\") pod \"insights-runtime-extractor-grrr6\" (UID: \"af39f682-9ae9-4abb-ad48-ca3370263b6b\") " pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.304373 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.304352 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:16.376362 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.376338 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"
Apr 22 16:24:16.395792 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.395763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-grrr6"
Apr 22 16:24:16.421317 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.421286 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-659c89dd9c-7zqf4"]
Apr 22 16:24:16.428171 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:16.428137 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada8f77b_6c93_4914_a78d_b753c44deb3e.slice/crio-92f7d1070ee2fde5119348ab2bf625e4011aa33352a6eace7e44c82b82006c0f WatchSource:0}: Error finding container 92f7d1070ee2fde5119348ab2bf625e4011aa33352a6eace7e44c82b82006c0f: Status 404 returned error can't find the container with id 92f7d1070ee2fde5119348ab2bf625e4011aa33352a6eace7e44c82b82006c0f
Apr 22 16:24:16.515403 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.515374 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99"]
Apr 22 16:24:16.518399 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:16.518372 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd347c8d_dbe5_4f7e_90f0_0fb9180c29c4.slice/crio-3ee8bbd8416fc18514842c6049d7074352dc32c11880a0bd1d6a8b55dd9b8517 WatchSource:0}: Error finding container 3ee8bbd8416fc18514842c6049d7074352dc32c11880a0bd1d6a8b55dd9b8517: Status 404 returned error can't find the container with id 3ee8bbd8416fc18514842c6049d7074352dc32c11880a0bd1d6a8b55dd9b8517
Apr 22 16:24:16.531339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:16.531314 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-grrr6"]
Apr 22 16:24:16.535101 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:16.535077 2575 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf39f682_9ae9_4abb_ad48_ca3370263b6b.slice/crio-ba42e6587824098a96c277d5d01b0380b607e128dc874f4e5d7aa4d75dff1923 WatchSource:0}: Error finding container ba42e6587824098a96c277d5d01b0380b607e128dc874f4e5d7aa4d75dff1923: Status 404 returned error can't find the container with id ba42e6587824098a96c277d5d01b0380b607e128dc874f4e5d7aa4d75dff1923 Apr 22 16:24:16.716128 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:16.716089 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8zkfn" podUID="e3238d38-c8d6-423c-bfa5-3feb9c21e8bc" Apr 22 16:24:16.738479 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:16.738403 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r9mm9" podUID="3ccac1e5-a013-4728-8544-cd8df005a479" Apr 22 16:24:17.286231 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.286196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99" event={"ID":"dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4","Type":"ContainerStarted","Data":"3ee8bbd8416fc18514842c6049d7074352dc32c11880a0bd1d6a8b55dd9b8517"} Apr 22 16:24:17.287713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.287685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grrr6" event={"ID":"af39f682-9ae9-4abb-ad48-ca3370263b6b","Type":"ContainerStarted","Data":"a248abd5d80ebdd8c64bc4aa28bd5e02a9d43b6d784dc039b09a4977894bcfae"} Apr 22 16:24:17.287713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.287720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-grrr6" event={"ID":"af39f682-9ae9-4abb-ad48-ca3370263b6b","Type":"ContainerStarted","Data":"ba42e6587824098a96c277d5d01b0380b607e128dc874f4e5d7aa4d75dff1923"} Apr 22 16:24:17.289114 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.289087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4" event={"ID":"ada8f77b-6c93-4914-a78d-b753c44deb3e","Type":"ContainerStarted","Data":"92b46fbb0aac2412d3c7c5ee9245b09d57cb6e6be023d674426350df7b2be33f"} Apr 22 16:24:17.289114 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.289110 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8zkfn" Apr 22 16:24:17.289266 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.289120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4" event={"ID":"ada8f77b-6c93-4914-a78d-b753c44deb3e","Type":"ContainerStarted","Data":"92f7d1070ee2fde5119348ab2bf625e4011aa33352a6eace7e44c82b82006c0f"} Apr 22 16:24:17.289306 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.289291 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4" Apr 22 16:24:17.311168 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:17.311118 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4" podStartSLOduration=2.311101628 podStartE2EDuration="2.311101628s" podCreationTimestamp="2026-04-22 16:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:24:17.309245023 +0000 UTC m=+157.011263136" watchObservedRunningTime="2026-04-22 16:24:17.311101628 +0000 UTC m=+157.013119738" Apr 22 16:24:17.912728 ip-10-0-142-238 kubenswrapper[2575]: 
E0422 16:24:17.912674 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5wqw7" podUID="09f37d35-30d1-4fc0-a88f-3514e6c16586" Apr 22 16:24:18.293336 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:18.293300 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grrr6" event={"ID":"af39f682-9ae9-4abb-ad48-ca3370263b6b","Type":"ContainerStarted","Data":"709419cfdeba70c9cf6e8938cc44307f33b631bfc6cf8fc2961bc2029ede8a5a"} Apr 22 16:24:18.294531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:18.294503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99" event={"ID":"dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4","Type":"ContainerStarted","Data":"5ea95445c3275c5afcd6d5ccde50913ff40f95476374d38fc16deef64ad21d7e"} Apr 22 16:24:18.312564 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:18.312522 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99" podStartSLOduration=1.045844926 podStartE2EDuration="2.312509793s" podCreationTimestamp="2026-04-22 16:24:16 +0000 UTC" firstStartedPulling="2026-04-22 16:24:16.520422711 +0000 UTC m=+156.222440802" lastFinishedPulling="2026-04-22 16:24:17.787087578 +0000 UTC m=+157.489105669" observedRunningTime="2026-04-22 16:24:18.311588963 +0000 UTC m=+158.013607073" watchObservedRunningTime="2026-04-22 16:24:18.312509793 +0000 UTC m=+158.014527901" Apr 22 16:24:19.298107 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.298070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7" 
event={"ID":"2fce02cf-d44f-4c69-80d1-524c6b7fb205","Type":"ContainerStarted","Data":"b0f2b7ea1251721914429e8cb3e3de5d3223ab060a67f6630641cd7dfd3bd7bf"} Apr 22 16:24:19.298482 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.298112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7" event={"ID":"2fce02cf-d44f-4c69-80d1-524c6b7fb205","Type":"ContainerStarted","Data":"675da91e7b28facb6b8733ae553177668b7e8a9a11d256cab1d60efc58b2cc14"} Apr 22 16:24:19.299682 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.299656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-grrr6" event={"ID":"af39f682-9ae9-4abb-ad48-ca3370263b6b","Type":"ContainerStarted","Data":"39e9ea73edf9d220e1094b2b142ac5532ac8abeea8aa390c6086bf4d77db7601"} Apr 22 16:24:19.299856 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.299839 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99" Apr 22 16:24:19.304444 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.304425 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-z6p99" Apr 22 16:24:19.315871 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.315831 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2m6w7" podStartSLOduration=33.199201268 podStartE2EDuration="35.315819537s" podCreationTimestamp="2026-04-22 16:23:44 +0000 UTC" firstStartedPulling="2026-04-22 16:24:16.237892998 +0000 UTC m=+155.939911085" lastFinishedPulling="2026-04-22 16:24:18.354511266 +0000 UTC m=+158.056529354" observedRunningTime="2026-04-22 16:24:19.314937163 +0000 UTC m=+159.016955271" watchObservedRunningTime="2026-04-22 
16:24:19.315819537 +0000 UTC m=+159.017837628" Apr 22 16:24:19.331998 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:19.331962 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-grrr6" podStartSLOduration=0.852093554 podStartE2EDuration="3.33195202s" podCreationTimestamp="2026-04-22 16:24:16 +0000 UTC" firstStartedPulling="2026-04-22 16:24:16.625529974 +0000 UTC m=+156.327548065" lastFinishedPulling="2026-04-22 16:24:19.105388426 +0000 UTC m=+158.807406531" observedRunningTime="2026-04-22 16:24:19.331087439 +0000 UTC m=+159.033105548" watchObservedRunningTime="2026-04-22 16:24:19.33195202 +0000 UTC m=+159.033970128" Apr 22 16:24:21.707411 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.707370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:24:21.707786 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.707440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:24:21.709904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.709877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3238d38-c8d6-423c-bfa5-3feb9c21e8bc-metrics-tls\") pod \"dns-default-8zkfn\" (UID: \"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc\") " pod="openshift-dns/dns-default-8zkfn" Apr 22 16:24:21.709996 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.709940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/3ccac1e5-a013-4728-8544-cd8df005a479-cert\") pod \"ingress-canary-r9mm9\" (UID: \"3ccac1e5-a013-4728-8544-cd8df005a479\") " pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:24:21.794010 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.793983 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fsrq7\"" Apr 22 16:24:21.800496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.800475 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8zkfn" Apr 22 16:24:21.918493 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:21.918457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8zkfn"] Apr 22 16:24:21.922069 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:21.922016 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3238d38_c8d6_423c_bfa5_3feb9c21e8bc.slice/crio-b21faf3322c88abfc5ea5a90db79aae7ce17ced015a34fcb32950da0166b4dec WatchSource:0}: Error finding container b21faf3322c88abfc5ea5a90db79aae7ce17ced015a34fcb32950da0166b4dec: Status 404 returned error can't find the container with id b21faf3322c88abfc5ea5a90db79aae7ce17ced015a34fcb32950da0166b4dec Apr 22 16:24:22.308354 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:22.308321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8zkfn" event={"ID":"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc","Type":"ContainerStarted","Data":"b21faf3322c88abfc5ea5a90db79aae7ce17ced015a34fcb32950da0166b4dec"} Apr 22 16:24:24.317337 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:24.317304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8zkfn" 
event={"ID":"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc","Type":"ContainerStarted","Data":"2920fc528f09bdb31b4aa62c04e9a0ab829bb11adcc03bbe01ad31948bfcd2ec"} Apr 22 16:24:24.317337 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:24.317337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8zkfn" event={"ID":"e3238d38-c8d6-423c-bfa5-3feb9c21e8bc","Type":"ContainerStarted","Data":"bd2fe19d5ef9c74c03c124ed36330ce773d563be720b57420e44f4e3e638aaab"} Apr 22 16:24:24.317757 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:24.317451 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8zkfn" Apr 22 16:24:24.338355 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:24.338305 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8zkfn" podStartSLOduration=129.432256574 podStartE2EDuration="2m11.338292183s" podCreationTimestamp="2026-04-22 16:22:13 +0000 UTC" firstStartedPulling="2026-04-22 16:24:21.923718885 +0000 UTC m=+161.625736976" lastFinishedPulling="2026-04-22 16:24:23.829754492 +0000 UTC m=+163.531772585" observedRunningTime="2026-04-22 16:24:24.3369173 +0000 UTC m=+164.038935418" watchObservedRunningTime="2026-04-22 16:24:24.338292183 +0000 UTC m=+164.040310291" Apr 22 16:24:25.486109 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.486077 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zcsck"] Apr 22 16:24:25.489446 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.489418 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.492356 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.492336 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 16:24:25.492487 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.492439 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q77pt\"" Apr 22 16:24:25.493230 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.493208 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 16:24:25.493399 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.493376 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 16:24:25.493399 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.493221 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 16:24:25.493570 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.493208 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 16:24:25.493570 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.493560 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 16:24:25.527691 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.527666 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"] Apr 22 16:24:25.530716 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.530698 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" Apr 22 16:24:25.532930 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.532910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-accelerators-collector-config\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.532939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-tls\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.532960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgt9g\" (UniqueName: \"kubernetes.io/projected/2d29fc65-9864-4e36-bfb7-97c820068952-kube-api-access-pgt9g\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.532984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-wtmp\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533173 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.533077 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d29fc65-9864-4e36-bfb7-97c820068952-metrics-client-ca\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533173 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.533146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-textfile\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533173 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.533163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.533222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-root\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.533282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.533232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 16:24:25.533282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.533236 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-sys\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.534090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.534027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 16:24:25.534233 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.534214 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 16:24:25.534332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.534259 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-km7gn\"" Apr 22 16:24:25.540821 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.540803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"] Apr 22 16:24:25.634076 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" Apr 22 16:24:25.634076 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: 
\"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" Apr 22 16:24:25.634263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-root\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-root\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-sys\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-accelerators-collector-config\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-tls\") pod 
\"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634263 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgt9g\" (UniqueName: \"kubernetes.io/projected/2d29fc65-9864-4e36-bfb7-97c820068952-kube-api-access-pgt9g\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-sys\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:25.634320 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-wtmp\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck" Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:25.634369 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-tls podName:2d29fc65-9864-4e36-bfb7-97c820068952 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:26.134354162 +0000 UTC m=+165.836372249 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-tls") pod "node-exporter-zcsck" (UID: "2d29fc65-9864-4e36-bfb7-97c820068952") : secret "node-exporter-tls" not found
Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d29fc65-9864-4e36-bfb7-97c820068952-metrics-client-ca\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq8zw\" (UniqueName: \"kubernetes.io/projected/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-api-access-qq8zw\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-wtmp\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.634557 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.634925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-textfile\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.634925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.634925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-textfile\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.634925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.634903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d29fc65-9864-4e36-bfb7-97c820068952-metrics-client-ca\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.635538 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.635518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-accelerators-collector-config\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.636889 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.636869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.644343 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.644318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgt9g\" (UniqueName: \"kubernetes.io/projected/2d29fc65-9864-4e36-bfb7-97c820068952-kube-api-access-pgt9g\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:25.735975 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.735943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq8zw\" (UniqueName: \"kubernetes.io/projected/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-api-access-qq8zw\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.735975 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.735982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736241 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736241 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736241 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736241 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736640 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.736924 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.736771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.738539 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.738517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.738624 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.738595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.743806 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.743785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq8zw\" (UniqueName: \"kubernetes.io/projected/42540c61-86c6-4b6b-9abc-a3e2fd297a0d-kube-api-access-qq8zw\") pod \"kube-state-metrics-69db897b98-9mtcp\" (UID: \"42540c61-86c6-4b6b-9abc-a3e2fd297a0d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.839338 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.839314 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"
Apr 22 16:24:25.964005 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:25.963974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9mtcp"]
Apr 22 16:24:25.967025 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:25.966998 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42540c61_86c6_4b6b_9abc_a3e2fd297a0d.slice/crio-a310234bcb55f00e305ef10a8e69f9d3fe98ff60a7f392e775c529905046b69e WatchSource:0}: Error finding container a310234bcb55f00e305ef10a8e69f9d3fe98ff60a7f392e775c529905046b69e: Status 404 returned error can't find the container with id a310234bcb55f00e305ef10a8e69f9d3fe98ff60a7f392e775c529905046b69e
Apr 22 16:24:26.139324 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.139237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-tls\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:26.141496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.141477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d29fc65-9864-4e36-bfb7-97c820068952-node-exporter-tls\") pod \"node-exporter-zcsck\" (UID: \"2d29fc65-9864-4e36-bfb7-97c820068952\") " pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:26.323398 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.323350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" event={"ID":"42540c61-86c6-4b6b-9abc-a3e2fd297a0d","Type":"ContainerStarted","Data":"a310234bcb55f00e305ef10a8e69f9d3fe98ff60a7f392e775c529905046b69e"}
Apr 22 16:24:26.398786 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.398714 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zcsck"
Apr 22 16:24:26.406596 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:26.406573 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d29fc65_9864_4e36_bfb7_97c820068952.slice/crio-e24ebff288676a4f688366240ccc4b3f45022b32e4755a92875af7ca8f59f02a WatchSource:0}: Error finding container e24ebff288676a4f688366240ccc4b3f45022b32e4755a92875af7ca8f59f02a: Status 404 returned error can't find the container with id e24ebff288676a4f688366240ccc4b3f45022b32e4755a92875af7ca8f59f02a
Apr 22 16:24:26.580404 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.580374 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 16:24:26.585127 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.585105 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.587687 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.587663 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 16:24:26.587810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.587687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 16:24:26.587810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.587736 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rgnw9\""
Apr 22 16:24:26.587810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.587779 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 16:24:26.587975 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.587831 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 16:24:26.588121 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.588085 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 16:24:26.588121 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.588088 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 16:24:26.588252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.588135 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 16:24:26.588252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.588105 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 16:24:26.588252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.588137 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 16:24:26.599540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.599512 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 16:24:26.643414 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643414 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643597 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643597 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-web-config\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643597 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-out\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643726 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643726 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643726 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-volume\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643855 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643855 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643933 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqjrp\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-kube-api-access-cqjrp\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643933 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.643933 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.643924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745072 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745072 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-web-config\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-out\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:26.745208 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-volume\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqjrp\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-kube-api-access-cqjrp\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.745467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:26.745551 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls podName:1402e546-be18-4ccb-b6bc-785cbbf26bff nodeName:}" failed. No retries permitted until 2026-04-22 16:24:27.245519355 +0000 UTC m=+166.947537452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff") : secret "alertmanager-main-tls" not found
Apr 22 16:24:26.745621 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:24:26.745591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle podName:1402e546-be18-4ccb-b6bc-785cbbf26bff nodeName:}" failed. No retries permitted until 2026-04-22 16:24:27.245578897 +0000 UTC m=+166.947596984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff") : configmap references non-existent config key: ca-bundle.crt
Apr 22 16:24:26.746219 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.746027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.748626 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.748601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-web-config\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.748792 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.748739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.748937 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.748912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.749340 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.749320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.749467 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.749431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-out\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.749518 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.749462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.750418 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.750400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-volume\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.750560 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.750533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:26.756670 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:26.756651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqjrp\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-kube-api-access-cqjrp\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:27.249880 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.249841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:27.250015 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.249927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:27.250672 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.250651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:27.252352 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.252335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:27.327288 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.327251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcsck" event={"ID":"2d29fc65-9864-4e36-bfb7-97c820068952","Type":"ContainerStarted","Data":"d94b7de9801b4fdf8a54ea09f70a2f9b0472579a5d460733f4c9cf223920a8ed"}
Apr 22 16:24:27.327420 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.327305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcsck" event={"ID":"2d29fc65-9864-4e36-bfb7-97c820068952","Type":"ContainerStarted","Data":"e24ebff288676a4f688366240ccc4b3f45022b32e4755a92875af7ca8f59f02a"}
Apr 22 16:24:27.494008 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.493973 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 16:24:27.620123 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:27.620089 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 16:24:27.623540 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:27.623515 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1402e546_be18_4ccb_b6bc_785cbbf26bff.slice/crio-a7e60bd0d5ef955c9c42d0193a94186603eccc5c8b3b54115fc7a279a3058c6c WatchSource:0}: Error finding container a7e60bd0d5ef955c9c42d0193a94186603eccc5c8b3b54115fc7a279a3058c6c: Status 404 returned error can't find the container with id a7e60bd0d5ef955c9c42d0193a94186603eccc5c8b3b54115fc7a279a3058c6c
Apr 22 16:24:28.331244 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.331163 2575 generic.go:358] "Generic (PLEG): container finished" podID="2d29fc65-9864-4e36-bfb7-97c820068952" containerID="d94b7de9801b4fdf8a54ea09f70a2f9b0472579a5d460733f4c9cf223920a8ed" exitCode=0
Apr 22 16:24:28.331400 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.331248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcsck" event={"ID":"2d29fc65-9864-4e36-bfb7-97c820068952","Type":"ContainerDied","Data":"d94b7de9801b4fdf8a54ea09f70a2f9b0472579a5d460733f4c9cf223920a8ed"}
Apr 22 16:24:28.333202 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.333182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" event={"ID":"42540c61-86c6-4b6b-9abc-a3e2fd297a0d","Type":"ContainerStarted","Data":"39a0c54b4750273cd1a9662440ddb4cdc036606552c2ab0733063db7d15578d9"}
Apr 22 16:24:28.333302 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.333209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" event={"ID":"42540c61-86c6-4b6b-9abc-a3e2fd297a0d","Type":"ContainerStarted","Data":"d35d700b290fcfcca2833dd57d07ba3364be0d5c5240ecc9eb0f91b9a69fda04"}
Apr 22 16:24:28.333302 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.333224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" event={"ID":"42540c61-86c6-4b6b-9abc-a3e2fd297a0d","Type":"ContainerStarted","Data":"6c4c8e66a1d9d94e806c2225699ced54165699f9736eec0b062618cc5ee7f222"}
Apr 22 16:24:28.334333 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.334313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"a7e60bd0d5ef955c9c42d0193a94186603eccc5c8b3b54115fc7a279a3058c6c"}
Apr 22 16:24:28.366585 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.366539 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9mtcp" podStartSLOduration=1.3567957210000001
podStartE2EDuration="3.366525054s" podCreationTimestamp="2026-04-22 16:24:25 +0000 UTC" firstStartedPulling="2026-04-22 16:24:25.968844127 +0000 UTC m=+165.670862214" lastFinishedPulling="2026-04-22 16:24:27.97857346 +0000 UTC m=+167.680591547" observedRunningTime="2026-04-22 16:24:28.365797504 +0000 UTC m=+168.067815613" watchObservedRunningTime="2026-04-22 16:24:28.366525054 +0000 UTC m=+168.068543162" Apr 22 16:24:28.474486 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.474461 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-78c87765bc-6xw5b"] Apr 22 16:24:28.479269 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.479249 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.482811 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.482582 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 16:24:28.482811 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.482613 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 16:24:28.482811 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.482633 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 16:24:28.482811 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.482654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 16:24:28.482811 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.482588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-38vpsqsodg1hf\"" Apr 22 16:24:28.482811 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:24:28.482782 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 16:24:28.483190 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.483016 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-p2px7\"" Apr 22 16:24:28.490405 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.490384 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-78c87765bc-6xw5b"] Apr 22 16:24:28.562830 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mphc\" (UniqueName: \"kubernetes.io/projected/a81b8fe4-04a8-4d2e-8da1-c17517d508db-kube-api-access-4mphc\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.562830 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.563092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a81b8fe4-04a8-4d2e-8da1-c17517d508db-metrics-client-ca\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 
16:24:28.563092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-tls\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.563092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.563092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-grpc-tls\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.563092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.562991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.563092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.563026 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664527 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-tls\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664527 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664976 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-grpc-tls\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664976 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664976 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664976 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mphc\" (UniqueName: \"kubernetes.io/projected/a81b8fe4-04a8-4d2e-8da1-c17517d508db-kube-api-access-4mphc\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664976 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.664976 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.664870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a81b8fe4-04a8-4d2e-8da1-c17517d508db-metrics-client-ca\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: 
\"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.665750 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.665686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a81b8fe4-04a8-4d2e-8da1-c17517d508db-metrics-client-ca\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.667894 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.667824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-tls\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.667894 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.667856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-grpc-tls\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.668084 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.667993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.668297 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.668275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.669004 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.668982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.669440 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.669418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a81b8fe4-04a8-4d2e-8da1-c17517d508db-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.673715 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.673696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mphc\" (UniqueName: \"kubernetes.io/projected/a81b8fe4-04a8-4d2e-8da1-c17517d508db-kube-api-access-4mphc\") pod \"thanos-querier-78c87765bc-6xw5b\" (UID: \"a81b8fe4-04a8-4d2e-8da1-c17517d508db\") " pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:28.789829 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:28.789796 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" Apr 22 16:24:29.104929 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.104903 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-78c87765bc-6xw5b"] Apr 22 16:24:29.107477 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:29.107450 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda81b8fe4_04a8_4d2e_8da1_c17517d508db.slice/crio-b073b5fe3db7dba9498cffd65bdf29ba420b01f9d48f2a8233f77ca53bec942f WatchSource:0}: Error finding container b073b5fe3db7dba9498cffd65bdf29ba420b01f9d48f2a8233f77ca53bec942f: Status 404 returned error can't find the container with id b073b5fe3db7dba9498cffd65bdf29ba420b01f9d48f2a8233f77ca53bec942f Apr 22 16:24:29.342677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.342593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcsck" event={"ID":"2d29fc65-9864-4e36-bfb7-97c820068952","Type":"ContainerStarted","Data":"a85c5f592390c5e28d503ba76659c8aa74dc29ad59dd00970d862f2253b4c0c8"} Apr 22 16:24:29.342677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.342632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zcsck" event={"ID":"2d29fc65-9864-4e36-bfb7-97c820068952","Type":"ContainerStarted","Data":"b805827f3bf2af20c11c8f14c965d590549dc2bfeea3e0cefed376b445105b41"} Apr 22 16:24:29.343841 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.343818 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerID="b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da" exitCode=0 Apr 22 16:24:29.343946 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.343871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da"} Apr 22 16:24:29.344990 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.344906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"b073b5fe3db7dba9498cffd65bdf29ba420b01f9d48f2a8233f77ca53bec942f"} Apr 22 16:24:29.367345 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.367299 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zcsck" podStartSLOduration=3.623259784 podStartE2EDuration="4.367286849s" podCreationTimestamp="2026-04-22 16:24:25 +0000 UTC" firstStartedPulling="2026-04-22 16:24:26.408070393 +0000 UTC m=+166.110088480" lastFinishedPulling="2026-04-22 16:24:27.152097457 +0000 UTC m=+166.854115545" observedRunningTime="2026-04-22 16:24:29.366643768 +0000 UTC m=+169.068661877" watchObservedRunningTime="2026-04-22 16:24:29.367286849 +0000 UTC m=+169.069304958" Apr 22 16:24:29.893226 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.893194 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67cbd6dbdd-c8gpz"] Apr 22 16:24:29.896627 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.896605 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.899313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.899279 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 16:24:29.899668 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.899649 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 16:24:29.900251 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.900233 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:24:29.900349 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.900331 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 16:24:29.900401 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.900368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 16:24:29.900455 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.900368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 16:24:29.900501 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.900457 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 16:24:29.901527 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.901506 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-59fnm\"" Apr 22 16:24:29.901611 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.901515 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 16:24:29.903006 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:24:29.902984 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pdvf2\"" Apr 22 16:24:29.904649 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.904485 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 16:24:29.909906 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.909887 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cbd6dbdd-c8gpz"] Apr 22 16:24:29.911454 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.911435 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r9mm9" Apr 22 16:24:29.978453 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9sgz\" (UniqueName: \"kubernetes.io/projected/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-kube-api-access-h9sgz\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.978453 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-config\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.978453 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-oauth-config\") pod \"console-67cbd6dbdd-c8gpz\" (UID: 
\"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.978453 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-service-ca\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.978729 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-trusted-ca-bundle\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.978729 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978572 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-oauth-serving-cert\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:29.978729 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:29.978640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-serving-cert\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:30.050497 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.050467 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-r9mm9"] Apr 22 16:24:30.053106 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:30.053076 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ccac1e5_a013_4728_8544_cd8df005a479.slice/crio-fe11901da2586fed33e34e663fa14b454c0f8ff1892dfa7590d7e81825be0fec WatchSource:0}: Error finding container fe11901da2586fed33e34e663fa14b454c0f8ff1892dfa7590d7e81825be0fec: Status 404 returned error can't find the container with id fe11901da2586fed33e34e663fa14b454c0f8ff1892dfa7590d7e81825be0fec Apr 22 16:24:30.079806 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.079780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-trusted-ca-bundle\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:30.079925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.079823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-oauth-serving-cert\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:30.079925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.079883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-serving-cert\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:24:30.079925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.079913 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h9sgz\" (UniqueName: \"kubernetes.io/projected/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-kube-api-access-h9sgz\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.080712 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.079942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-config\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.080712 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.079972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-oauth-config\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.080712 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.080004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-service-ca\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.080818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.080780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-config\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.081370 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.081345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-trusted-ca-bundle\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.081509 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.081482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-oauth-serving-cert\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.081841 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.081822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-service-ca\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.083604 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.083584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-serving-cert\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.085304 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.085280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-oauth-config\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.089162 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.089141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9sgz\" (UniqueName: \"kubernetes.io/projected/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-kube-api-access-h9sgz\") pod \"console-67cbd6dbdd-c8gpz\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") " pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.208908 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.208840 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:30.346922 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.346893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67cbd6dbdd-c8gpz"]
Apr 22 16:24:30.349938 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:30.349914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r9mm9" event={"ID":"3ccac1e5-a013-4728-8544-cd8df005a479","Type":"ContainerStarted","Data":"fe11901da2586fed33e34e663fa14b454c0f8ff1892dfa7590d7e81825be0fec"}
Apr 22 16:24:30.350631 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:30.350603 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3e44ec_208c_4f49_a0f6_7d1bbcad7845.slice/crio-96f4abd1f4377d1707b32f4d015a5d8d41ad20009c09fb3b9ac5955dd2a009f6 WatchSource:0}: Error finding container 96f4abd1f4377d1707b32f4d015a5d8d41ad20009c09fb3b9ac5955dd2a009f6: Status 404 returned error can't find the container with id 96f4abd1f4377d1707b32f4d015a5d8d41ad20009c09fb3b9ac5955dd2a009f6
Apr 22 16:24:31.358848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:31.358791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbd6dbdd-c8gpz" event={"ID":"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845","Type":"ContainerStarted","Data":"96f4abd1f4377d1707b32f4d015a5d8d41ad20009c09fb3b9ac5955dd2a009f6"}
Apr 22 16:24:32.367384 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:32.366471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa"}
Apr 22 16:24:32.367384 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:32.366510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2"}
Apr 22 16:24:32.370832 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:32.370810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"057e1d42f7f19187fb7d831fbe12aee7ace1b7a6b718b775a5ec0ec471505153"}
Apr 22 16:24:32.372528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:32.372503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r9mm9" event={"ID":"3ccac1e5-a013-4728-8544-cd8df005a479","Type":"ContainerStarted","Data":"a9839d8bfdf8b817135c03aae4f1c3c8f2d693b8db0648e5135cae2229523616"}
Apr 22 16:24:32.390534 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:32.390481 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r9mm9" podStartSLOduration=137.272703662 podStartE2EDuration="2m19.390463682s" podCreationTimestamp="2026-04-22 16:22:13 +0000 UTC" firstStartedPulling="2026-04-22 16:24:30.05564051 +0000 UTC m=+169.757658601" lastFinishedPulling="2026-04-22 16:24:32.17340052 +0000 UTC m=+171.875418621" observedRunningTime="2026-04-22 16:24:32.388823402 +0000 UTC m=+172.090841512" watchObservedRunningTime="2026-04-22 16:24:32.390463682 +0000 UTC m=+172.092481791"
Apr 22 16:24:32.900510 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:32.900466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7"
Apr 22 16:24:33.378721 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:33.378678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d"}
Apr 22 16:24:33.379147 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:33.378731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e"}
Apr 22 16:24:33.379147 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:33.378746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b"}
Apr 22 16:24:33.380805 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:33.380782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"e1d5466d885baeacfc06a2e59c781a8a05a91d84774e5b3f40b4fbb8a8534736"}
Apr 22 16:24:33.380909 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:33.380811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"436aa53d90679aab0f9907d582c25b702611ff8d531af1573207566e43aa15af"}
Apr 22 16:24:34.322287 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.322268 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8zkfn"
Apr 22 16:24:34.387017 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.386942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerStarted","Data":"783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808"}
Apr 22 16:24:34.389343 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.389316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"8120c51e54976e4693a0098912dbb7d989c0ac7ab91242663719656a52adbd2f"}
Apr 22 16:24:34.389435 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.389346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"dcf949c1dc39cf6dd32d659ad419f4146f519380c71455a0bfa9e8791f70e024"}
Apr 22 16:24:34.389435 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.389359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" event={"ID":"a81b8fe4-04a8-4d2e-8da1-c17517d508db","Type":"ContainerStarted","Data":"6a2ac1a4d88f8e29ae345b8cfd3f279d17ec310a9cdb5d4c40d04498d3e8e11f"}
Apr 22 16:24:34.389514 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.389495 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b"
Apr 22 16:24:34.390901 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.390883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbd6dbdd-c8gpz" event={"ID":"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845","Type":"ContainerStarted","Data":"55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97"}
Apr 22 16:24:34.415179 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.415134 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.899909622 podStartE2EDuration="8.415121137s" podCreationTimestamp="2026-04-22 16:24:26 +0000 UTC" firstStartedPulling="2026-04-22 16:24:27.625314149 +0000 UTC m=+167.327332237" lastFinishedPulling="2026-04-22 16:24:34.140525652 +0000 UTC m=+173.842543752" observedRunningTime="2026-04-22 16:24:34.413637953 +0000 UTC m=+174.115656061" watchObservedRunningTime="2026-04-22 16:24:34.415121137 +0000 UTC m=+174.117139303"
Apr 22 16:24:34.439351 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.439307 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b" podStartSLOduration=1.4071275650000001 podStartE2EDuration="6.439294159s" podCreationTimestamp="2026-04-22 16:24:28 +0000 UTC" firstStartedPulling="2026-04-22 16:24:29.109750669 +0000 UTC m=+168.811768756" lastFinishedPulling="2026-04-22 16:24:34.141917262 +0000 UTC m=+173.843935350" observedRunningTime="2026-04-22 16:24:34.437460721 +0000 UTC m=+174.139478832" watchObservedRunningTime="2026-04-22 16:24:34.439294159 +0000 UTC m=+174.141312268"
Apr 22 16:24:34.459768 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:34.459727 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67cbd6dbdd-c8gpz" podStartSLOduration=2.137514437 podStartE2EDuration="5.459714652s" podCreationTimestamp="2026-04-22 16:24:29 +0000 UTC" firstStartedPulling="2026-04-22 16:24:30.352501929 +0000 UTC m=+170.054520032" lastFinishedPulling="2026-04-22 16:24:33.67470216 +0000 UTC m=+173.376720247" observedRunningTime="2026-04-22 16:24:34.457947624 +0000 UTC m=+174.159965732" watchObservedRunningTime="2026-04-22 16:24:34.459714652 +0000 UTC m=+174.161732760"
Apr 22 16:24:35.726542 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.726508 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ffd596d5b-dllvs"]
Apr 22 16:24:35.730015 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.729998 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.740302 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.740279 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffd596d5b-dllvs"]
Apr 22 16:24:35.837174 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-oauth-config\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.837312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-config\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.837312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-service-ca\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.837312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht6rl\" (UniqueName: \"kubernetes.io/projected/c28eebd3-109e-4aed-9da2-953e0b8fb062-kube-api-access-ht6rl\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.837428 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-oauth-serving-cert\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.837428 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-serving-cert\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.837428 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.837391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-trusted-ca-bundle\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.937818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.937783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-config\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.937818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.937825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-service-ca\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.937854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht6rl\" (UniqueName: \"kubernetes.io/projected/c28eebd3-109e-4aed-9da2-953e0b8fb062-kube-api-access-ht6rl\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.937900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-oauth-serving-cert\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.937943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-serving-cert\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.937968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-trusted-ca-bundle\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.938002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-oauth-config\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938562 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.938539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-service-ca\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938786 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.938557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-config\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938786 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.938687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-oauth-serving-cert\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.938914 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.938764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-trusted-ca-bundle\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.940532 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.940514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-oauth-config\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.940622 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.940589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-serving-cert\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:35.945932 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:35.945912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht6rl\" (UniqueName: \"kubernetes.io/projected/c28eebd3-109e-4aed-9da2-953e0b8fb062-kube-api-access-ht6rl\") pod \"console-6ffd596d5b-dllvs\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:36.039746 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.039660 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:36.182182 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.182157 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffd596d5b-dllvs"]
Apr 22 16:24:36.183861 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:24:36.183831 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28eebd3_109e_4aed_9da2_953e0b8fb062.slice/crio-8a9f8a5a67cd78565ae9a48caaf6ac928a7f6a76e06c055baa248bea4cd9d95a WatchSource:0}: Error finding container 8a9f8a5a67cd78565ae9a48caaf6ac928a7f6a76e06c055baa248bea4cd9d95a: Status 404 returned error can't find the container with id 8a9f8a5a67cd78565ae9a48caaf6ac928a7f6a76e06c055baa248bea4cd9d95a
Apr 22 16:24:36.308671 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.308578 2575 patch_prober.go:28] interesting pod/image-registry-659c89dd9c-7zqf4 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 16:24:36.308671 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.308629 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4" podUID="ada8f77b-6c93-4914-a78d-b753c44deb3e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 16:24:36.399838 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.399802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffd596d5b-dllvs" event={"ID":"c28eebd3-109e-4aed-9da2-953e0b8fb062","Type":"ContainerStarted","Data":"ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87"}
Apr 22 16:24:36.399838 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.399837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffd596d5b-dllvs" event={"ID":"c28eebd3-109e-4aed-9da2-953e0b8fb062","Type":"ContainerStarted","Data":"8a9f8a5a67cd78565ae9a48caaf6ac928a7f6a76e06c055baa248bea4cd9d95a"}
Apr 22 16:24:36.423676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:36.423630 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ffd596d5b-dllvs" podStartSLOduration=1.423616587 podStartE2EDuration="1.423616587s" podCreationTimestamp="2026-04-22 16:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:24:36.420544235 +0000 UTC m=+176.122562345" watchObservedRunningTime="2026-04-22 16:24:36.423616587 +0000 UTC m=+176.125634696"
Apr 22 16:24:38.298673 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:38.298649 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-659c89dd9c-7zqf4"
Apr 22 16:24:40.209918 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:40.209881 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:40.209918 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:40.209927 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:40.214455 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:40.214424 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:40.401840 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:40.401812 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-78c87765bc-6xw5b"
Apr 22 16:24:40.414927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:40.414901 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:24:46.039829 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:46.039795 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:46.040291 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:46.039842 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:46.044262 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:46.044242 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:46.431955 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:46.431887 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ffd596d5b-dllvs"
Apr 22 16:24:46.480520 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:46.480487 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67cbd6dbdd-c8gpz"]
Apr 22 16:24:49.438222 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:49.438191 2575 generic.go:358] "Generic (PLEG): container finished" podID="47b372d5-7432-499c-b5f2-baff8c5f3689" containerID="222cca632b4e750650e6647a784e2a536f7ccb60b4d0a72b5133063791147338" exitCode=0
Apr 22 16:24:49.438604 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:49.438267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrwl2" event={"ID":"47b372d5-7432-499c-b5f2-baff8c5f3689","Type":"ContainerDied","Data":"222cca632b4e750650e6647a784e2a536f7ccb60b4d0a72b5133063791147338"}
Apr 22 16:24:49.438604 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:49.438579 2575 scope.go:117] "RemoveContainer" containerID="222cca632b4e750650e6647a784e2a536f7ccb60b4d0a72b5133063791147338"
Apr 22 16:24:50.442322 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:50.442288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hrwl2" event={"ID":"47b372d5-7432-499c-b5f2-baff8c5f3689","Type":"ContainerStarted","Data":"ef46d0132a03e5bb56362f1b85cad33c89539ef34e9817ddb0b66416dd653334"}
Apr 22 16:24:51.514696 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:51.514663 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8zkfn_e3238d38-c8d6-423c-bfa5-3feb9c21e8bc/dns/0.log"
Apr 22 16:24:51.717268 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:51.717239 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8zkfn_e3238d38-c8d6-423c-bfa5-3feb9c21e8bc/kube-rbac-proxy/0.log"
Apr 22 16:24:52.115536 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:52.115507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jg6wx_91576000-c254-43f5-84ba-7029c347da22/dns-node-resolver/0.log"
Apr 22 16:24:52.517218 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:52.517192 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6db85cc586-wptmg_de8899ee-ffb9-447d-bfe6-3f4560af10d3/router/0.log"
Apr 22 16:24:52.921459 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:52.921380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r9mm9_3ccac1e5-a013-4728-8544-cd8df005a479/serve-healthcheck-canary/0.log"
Apr 22 16:24:53.316252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:53.316223 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2m6w7_2fce02cf-d44f-4c69-80d1-524c6b7fb205/cluster-samples-operator/0.log"
Apr 22 16:24:53.517790 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:24:53.517766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2m6w7_2fce02cf-d44f-4c69-80d1-524c6b7fb205/cluster-samples-operator-watch/0.log"
Apr 22 16:25:03.483419 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:03.483387 2575 generic.go:358] "Generic (PLEG): container finished" podID="b81523a0-77f7-4e1d-9f17-d99fd060b090" containerID="02d2d52b7362666fe89ec5c3bf2f79f572916fd3be4a0c4603ba3f1ac8f17e3d" exitCode=0
Apr 22 16:25:03.483773 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:03.483440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" event={"ID":"b81523a0-77f7-4e1d-9f17-d99fd060b090","Type":"ContainerDied","Data":"02d2d52b7362666fe89ec5c3bf2f79f572916fd3be4a0c4603ba3f1ac8f17e3d"}
Apr 22 16:25:03.483773 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:03.483763 2575 scope.go:117] "RemoveContainer" containerID="02d2d52b7362666fe89ec5c3bf2f79f572916fd3be4a0c4603ba3f1ac8f17e3d"
Apr 22 16:25:04.487578 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:04.487545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jjjg5" event={"ID":"b81523a0-77f7-4e1d-9f17-d99fd060b090","Type":"ContainerStarted","Data":"f73ee2381fea0c4dbfcb78001158978740e5ad6f2282b281852a353328bf7134"}
Apr 22 16:25:11.500546 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.500481 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67cbd6dbdd-c8gpz" podUID="1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" containerName="console" containerID="cri-o://55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97" gracePeriod=15
Apr 22 16:25:11.730390 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.730366 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cbd6dbdd-c8gpz_1b3e44ec-208c-4f49-a0f6-7d1bbcad7845/console/0.log"
Apr 22 16:25:11.730493 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.730442 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67cbd6dbdd-c8gpz"
Apr 22 16:25:11.750618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750555 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-serving-cert\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.750618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750585 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-oauth-serving-cert\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.750618 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750617 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-oauth-config\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.750837 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750662 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-trusted-ca-bundle\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.750837 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750809 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-config\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.750946 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750845 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-service-ca\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.750946 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.750886 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9sgz\" (UniqueName: \"kubernetes.io/projected/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-kube-api-access-h9sgz\") pod \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\" (UID: \"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845\") "
Apr 22 16:25:11.751227 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.751104 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:25:11.751227 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.751205 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:25:11.751559 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.751280 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-service-ca" (OuterVolumeSpecName: "service-ca") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:25:11.751962 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.751897 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-trusted-ca-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:25:11.751962 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.751918 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-service-ca\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:25:11.751962 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.751931 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-oauth-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:25:11.752193 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.752139 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-config" (OuterVolumeSpecName: "console-config") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:25:11.753942 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.753601 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:25:11.753942 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.753669 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:25:11.754134 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.753996 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-kube-api-access-h9sgz" (OuterVolumeSpecName: "kube-api-access-h9sgz") pod "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" (UID: "1b3e44ec-208c-4f49-a0f6-7d1bbcad7845"). InnerVolumeSpecName "kube-api-access-h9sgz".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:25:11.852332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.852290 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:11.852332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.852329 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9sgz\" (UniqueName: \"kubernetes.io/projected/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-kube-api-access-h9sgz\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:11.852332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.852340 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:11.852543 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:11.852349 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845-console-oauth-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:12.511519 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.511494 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67cbd6dbdd-c8gpz_1b3e44ec-208c-4f49-a0f6-7d1bbcad7845/console/0.log" Apr 22 16:25:12.511878 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.511537 2575 generic.go:358] "Generic (PLEG): container finished" podID="1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" containerID="55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97" exitCode=2 Apr 22 16:25:12.511878 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.511610 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67cbd6dbdd-c8gpz" Apr 22 16:25:12.511878 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.511642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbd6dbdd-c8gpz" event={"ID":"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845","Type":"ContainerDied","Data":"55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97"} Apr 22 16:25:12.511878 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.511689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67cbd6dbdd-c8gpz" event={"ID":"1b3e44ec-208c-4f49-a0f6-7d1bbcad7845","Type":"ContainerDied","Data":"96f4abd1f4377d1707b32f4d015a5d8d41ad20009c09fb3b9ac5955dd2a009f6"} Apr 22 16:25:12.511878 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.511707 2575 scope.go:117] "RemoveContainer" containerID="55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97" Apr 22 16:25:12.520612 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.520594 2575 scope.go:117] "RemoveContainer" containerID="55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97" Apr 22 16:25:12.520848 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:12.520828 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97\": container with ID starting with 55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97 not found: ID does not exist" containerID="55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97" Apr 22 16:25:12.520921 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.520859 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97"} err="failed to get container status \"55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97\": rpc error: code = 
NotFound desc = could not find container \"55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97\": container with ID starting with 55d27c2c1cb3354250ae1d1dae97145e55e77723dd4c1041c3904f93fb00eb97 not found: ID does not exist" Apr 22 16:25:12.535892 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.535865 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67cbd6dbdd-c8gpz"] Apr 22 16:25:12.539550 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.539532 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67cbd6dbdd-c8gpz"] Apr 22 16:25:12.904721 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:12.904647 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" path="/var/lib/kubelet/pods/1b3e44ec-208c-4f49-a0f6-7d1bbcad7845/volumes" Apr 22 16:25:19.531866 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:19.531832 2575 generic.go:358] "Generic (PLEG): container finished" podID="656f9afe-2a94-4ce8-8322-98dccbf691a1" containerID="b5c12b3ad9df14b61020edef0b1953757fbff477b94b782ce990d9a29c5e57ee" exitCode=0 Apr 22 16:25:19.532280 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:19.531877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" event={"ID":"656f9afe-2a94-4ce8-8322-98dccbf691a1","Type":"ContainerDied","Data":"b5c12b3ad9df14b61020edef0b1953757fbff477b94b782ce990d9a29c5e57ee"} Apr 22 16:25:19.532280 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:19.532187 2575 scope.go:117] "RemoveContainer" containerID="b5c12b3ad9df14b61020edef0b1953757fbff477b94b782ce990d9a29c5e57ee" Apr 22 16:25:20.536187 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:20.536158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4n4pr" 
event={"ID":"656f9afe-2a94-4ce8-8322-98dccbf691a1","Type":"ContainerStarted","Data":"489cca0cc802d6a3a9586d4f8e3e079420b258695f84a1acf4f8e8e7cec9b37d"} Apr 22 16:25:45.793709 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.793673 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:25:45.794126 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.794079 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="alertmanager" containerID="cri-o://26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2" gracePeriod=120 Apr 22 16:25:45.794192 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.794138 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-metric" containerID="cri-o://16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d" gracePeriod=120 Apr 22 16:25:45.794242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.794203 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy" containerID="cri-o://cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e" gracePeriod=120 Apr 22 16:25:45.794242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.794194 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="prom-label-proxy" containerID="cri-o://783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808" gracePeriod=120 Apr 22 16:25:45.794332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.794177 2575 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-web" containerID="cri-o://589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b" gracePeriod=120 Apr 22 16:25:45.794332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:45.794198 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="config-reloader" containerID="cri-o://bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa" gracePeriod=120 Apr 22 16:25:46.613935 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.613903 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerID="783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808" exitCode=0 Apr 22 16:25:46.613935 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.613928 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerID="16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d" exitCode=0 Apr 22 16:25:46.613935 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.613934 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerID="cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e" exitCode=0 Apr 22 16:25:46.613935 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.613939 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerID="bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa" exitCode=0 Apr 22 16:25:46.613935 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.613944 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" 
containerID="26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2" exitCode=0 Apr 22 16:25:46.614242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.613968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808"} Apr 22 16:25:46.614242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.614000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d"} Apr 22 16:25:46.614242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.614011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e"} Apr 22 16:25:46.614242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.614019 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa"} Apr 22 16:25:46.614242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:46.614028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2"} Apr 22 16:25:47.032789 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.032767 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.153872 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.153840 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-metrics-client-ca\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154069 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.153882 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154069 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.153918 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-web-config\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154069 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.153952 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154069 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.153996 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-out\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: 
\"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154069 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154028 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqjrp\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-kube-api-access-cqjrp\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154087 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-volume\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154122 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154150 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-cluster-tls-config\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154189 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-main-db\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:25:47.154213 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-tls-assets\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154253 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-web\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154301 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy\") pod \"1402e546-be18-4ccb-b6bc-785cbbf26bff\" (UID: \"1402e546-be18-4ccb-b6bc-785cbbf26bff\") " Apr 22 16:25:47.154325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154300 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:47.154693 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154376 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:47.154693 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154602 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:25:47.154804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154666 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-metrics-client-ca\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.154932 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.154825 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.157245 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.157196 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-out" (OuterVolumeSpecName: "config-out") pod 
"1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:25:47.157406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.157361 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.157684 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.157561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-kube-api-access-cqjrp" (OuterVolumeSpecName: "kube-api-access-cqjrp") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "kube-api-access-cqjrp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:25:47.157684 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.157653 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.157798 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.157681 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.157798 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.157705 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:25:47.158652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.158631 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-volume" (OuterVolumeSpecName: "config-volume") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.159006 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.158979 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.161944 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.161871 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.168157 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.168135 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-web-config" (OuterVolumeSpecName: "web-config") pod "1402e546-be18-4ccb-b6bc-785cbbf26bff" (UID: "1402e546-be18-4ccb-b6bc-785cbbf26bff"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:47.255869 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255836 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.255869 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255862 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.255869 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255875 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-web-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 
16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255884 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-out\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255892 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqjrp\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-kube-api-access-cqjrp\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255900 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-config-volume\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255909 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-main-tls\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255918 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-cluster-tls-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255926 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1402e546-be18-4ccb-b6bc-785cbbf26bff-alertmanager-main-db\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255934 2575 
reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1402e546-be18-4ccb-b6bc-785cbbf26bff-tls-assets\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.256112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.255943 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1402e546-be18-4ccb-b6bc-785cbbf26bff-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:25:47.620158 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.620121 2575 generic.go:358] "Generic (PLEG): container finished" podID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerID="589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b" exitCode=0 Apr 22 16:25:47.620312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.620195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b"} Apr 22 16:25:47.620312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.620233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1402e546-be18-4ccb-b6bc-785cbbf26bff","Type":"ContainerDied","Data":"a7e60bd0d5ef955c9c42d0193a94186603eccc5c8b3b54115fc7a279a3058c6c"} Apr 22 16:25:47.620312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.620249 2575 scope.go:117] "RemoveContainer" containerID="783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808" Apr 22 16:25:47.620312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.620263 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.627819 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.627661 2575 scope.go:117] "RemoveContainer" containerID="16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d" Apr 22 16:25:47.634499 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.634481 2575 scope.go:117] "RemoveContainer" containerID="cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e" Apr 22 16:25:47.640675 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.640653 2575 scope.go:117] "RemoveContainer" containerID="589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b" Apr 22 16:25:47.642411 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.642349 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:25:47.647605 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.647585 2575 scope.go:117] "RemoveContainer" containerID="bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa" Apr 22 16:25:47.648749 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.648729 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:25:47.653771 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.653754 2575 scope.go:117] "RemoveContainer" containerID="26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2" Apr 22 16:25:47.660001 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.659982 2575 scope.go:117] "RemoveContainer" containerID="b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da" Apr 22 16:25:47.666021 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666006 2575 scope.go:117] "RemoveContainer" containerID="783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808" Apr 22 16:25:47.666291 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.666264 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808\": container with ID starting with 783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808 not found: ID does not exist" containerID="783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808" Apr 22 16:25:47.666373 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666292 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808"} err="failed to get container status \"783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808\": rpc error: code = NotFound desc = could not find container \"783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808\": container with ID starting with 783c368416ee596b185e1f0d6969a92990a9c6312c47b2a94fd4f85b55532808 not found: ID does not exist" Apr 22 16:25:47.666373 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666313 2575 scope.go:117] "RemoveContainer" containerID="16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d" Apr 22 16:25:47.666585 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.666563 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d\": container with ID starting with 16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d not found: ID does not exist" containerID="16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d" Apr 22 16:25:47.666620 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666592 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d"} err="failed to get container status \"16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d\": rpc error: code = NotFound desc 
= could not find container \"16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d\": container with ID starting with 16e066a714eda9c4256350b72be6242f446538db8444d837e0a64de5d7f0322d not found: ID does not exist" Apr 22 16:25:47.666620 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666608 2575 scope.go:117] "RemoveContainer" containerID="cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e" Apr 22 16:25:47.666794 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.666779 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e\": container with ID starting with cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e not found: ID does not exist" containerID="cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e" Apr 22 16:25:47.666836 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666798 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e"} err="failed to get container status \"cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e\": rpc error: code = NotFound desc = could not find container \"cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e\": container with ID starting with cab7ad48d64776c27d0ebe6617170a351a0314a6727bbf04279f51489685867e not found: ID does not exist" Apr 22 16:25:47.666836 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666812 2575 scope.go:117] "RemoveContainer" containerID="589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b" Apr 22 16:25:47.666990 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.666977 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b\": 
container with ID starting with 589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b not found: ID does not exist" containerID="589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b" Apr 22 16:25:47.667032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.666993 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b"} err="failed to get container status \"589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b\": rpc error: code = NotFound desc = could not find container \"589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b\": container with ID starting with 589636d9c2946a57b1dac4442719b4ad8a38d8830c2986943745d987ce894c3b not found: ID does not exist" Apr 22 16:25:47.667032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.667004 2575 scope.go:117] "RemoveContainer" containerID="bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa" Apr 22 16:25:47.667278 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.667261 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa\": container with ID starting with bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa not found: ID does not exist" containerID="bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa" Apr 22 16:25:47.667346 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.667285 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa"} err="failed to get container status \"bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa\": rpc error: code = NotFound desc = could not find container \"bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa\": container with 
ID starting with bcff73ee9cb6cbf0cbb00a8612b7c01eb5511eea32d0b4f7f0c3422b66dd45fa not found: ID does not exist" Apr 22 16:25:47.667346 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.667304 2575 scope.go:117] "RemoveContainer" containerID="26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2" Apr 22 16:25:47.667546 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.667528 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2\": container with ID starting with 26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2 not found: ID does not exist" containerID="26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2" Apr 22 16:25:47.667591 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.667552 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2"} err="failed to get container status \"26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2\": rpc error: code = NotFound desc = could not find container \"26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2\": container with ID starting with 26168870dae54063284550f3cc1a78fde1aab808c1469c869ee6ebc026f937b2 not found: ID does not exist" Apr 22 16:25:47.667591 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.667572 2575 scope.go:117] "RemoveContainer" containerID="b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da" Apr 22 16:25:47.667764 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:25:47.667747 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da\": container with ID starting with b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da not found: ID does 
not exist" containerID="b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da" Apr 22 16:25:47.667810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.667769 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da"} err="failed to get container status \"b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da\": rpc error: code = NotFound desc = could not find container \"b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da\": container with ID starting with b41d2466b684874e14388ab0494948dd36595a58450fabcb175932f6380e75da not found: ID does not exist" Apr 22 16:25:47.676766 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.676742 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:25:47.677030 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677019 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="alertmanager" Apr 22 16:25:47.677097 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677032 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="alertmanager" Apr 22 16:25:47.677097 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677081 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy" Apr 22 16:25:47.677097 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677088 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy" Apr 22 16:25:47.677097 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677095 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" 
containerName="console" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677101 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" containerName="console" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677109 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="config-reloader" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677115 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="config-reloader" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677126 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-web" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677131 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-web" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677138 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="prom-label-proxy" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677143 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="prom-label-proxy" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677150 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="init-config-reloader" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677155 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="init-config-reloader" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677160 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-metric" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677165 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-metric" Apr 22 16:25:47.677206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677210 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-metric" Apr 22 16:25:47.677506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677221 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="alertmanager" Apr 22 16:25:47.677506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677228 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b3e44ec-208c-4f49-a0f6-7d1bbcad7845" containerName="console" Apr 22 16:25:47.677506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677236 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="config-reloader" Apr 22 16:25:47.677506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677242 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="prom-label-proxy" Apr 22 16:25:47.677506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677247 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy-web" Apr 22 16:25:47.677506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.677253 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" containerName="kube-rbac-proxy" Apr 22 16:25:47.679541 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.679527 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.681877 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.681860 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 16:25:47.681877 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.681876 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 16:25:47.682026 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.681867 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 16:25:47.682026 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.681923 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 16:25:47.682026 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.681923 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 16:25:47.682309 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.682296 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 16:25:47.682368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.682338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 16:25:47.682442 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.682424 2575 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 16:25:47.682531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.682516 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rgnw9\"" Apr 22 16:25:47.688218 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.688200 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 16:25:47.694276 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.694255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:25:47.861600 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/02918e16-1e6c-44cd-9fc0-2e3caac620b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/02918e16-1e6c-44cd-9fc0-2e3caac620b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
16:25:47.861718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02918e16-1e6c-44cd-9fc0-2e3caac620b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/02918e16-1e6c-44cd-9fc0-2e3caac620b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/02918e16-1e6c-44cd-9fc0-2e3caac620b6-config-out\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861999 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861999 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.861920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-web-config\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.861999 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:25:47.861947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qv2\" (UniqueName: \"kubernetes.io/projected/02918e16-1e6c-44cd-9fc0-2e3caac620b6-kube-api-access-l7qv2\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962572 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/02918e16-1e6c-44cd-9fc0-2e3caac620b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962730 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/02918e16-1e6c-44cd-9fc0-2e3caac620b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962730 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962730 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962870 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962870 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02918e16-1e6c-44cd-9fc0-2e3caac620b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962870 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.962870 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02918e16-1e6c-44cd-9fc0-2e3caac620b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/02918e16-1e6c-44cd-9fc0-2e3caac620b6-config-out\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.962983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-web-config\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.963004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/02918e16-1e6c-44cd-9fc0-2e3caac620b6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963155 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.963014 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l7qv2\" (UniqueName: \"kubernetes.io/projected/02918e16-1e6c-44cd-9fc0-2e3caac620b6-kube-api-access-l7qv2\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.963875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.963845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02918e16-1e6c-44cd-9fc0-2e3caac620b6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.964839 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.964790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02918e16-1e6c-44cd-9fc0-2e3caac620b6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.965719 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.965627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/02918e16-1e6c-44cd-9fc0-2e3caac620b6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.965898 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.965874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.966088 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.966015 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-config-volume\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.966198 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.966111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.966198 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.966144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.966987 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.966963 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.967219 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.967199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.967415 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.967400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/02918e16-1e6c-44cd-9fc0-2e3caac620b6-config-out\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.967772 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.967758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/02918e16-1e6c-44cd-9fc0-2e3caac620b6-web-config\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.974743 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.974719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qv2\" (UniqueName: \"kubernetes.io/projected/02918e16-1e6c-44cd-9fc0-2e3caac620b6-kube-api-access-l7qv2\") pod \"alertmanager-main-0\" (UID: \"02918e16-1e6c-44cd-9fc0-2e3caac620b6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:47.988633 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:47.988615 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 16:25:48.118257 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:48.118230 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 16:25:48.120242 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:25:48.120219 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02918e16_1e6c_44cd_9fc0_2e3caac620b6.slice/crio-71dcba4913b1c6779c32dee0a1135b61ec4a24dca646dbddc640ed0620480a93 WatchSource:0}: Error finding container 71dcba4913b1c6779c32dee0a1135b61ec4a24dca646dbddc640ed0620480a93: Status 404 returned error can't find the container with id 71dcba4913b1c6779c32dee0a1135b61ec4a24dca646dbddc640ed0620480a93 Apr 22 16:25:48.624884 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:48.624848 2575 generic.go:358] "Generic (PLEG): container finished" podID="02918e16-1e6c-44cd-9fc0-2e3caac620b6" containerID="d9ea6c552588edc97f3046b6764ebc22d8baa3a46f7d3585c17c9989552e82f6" exitCode=0 Apr 22 16:25:48.625072 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:48.624904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerDied","Data":"d9ea6c552588edc97f3046b6764ebc22d8baa3a46f7d3585c17c9989552e82f6"} Apr 22 16:25:48.625072 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:48.624932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"71dcba4913b1c6779c32dee0a1135b61ec4a24dca646dbddc640ed0620480a93"} Apr 22 16:25:48.905963 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:48.905937 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1402e546-be18-4ccb-b6bc-785cbbf26bff" 
path="/var/lib/kubelet/pods/1402e546-be18-4ccb-b6bc-785cbbf26bff/volumes" Apr 22 16:25:49.634804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.634768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"2ac6aaa21997b81d0564e85d41993f7b7b43936edef5198cb166038c33162ef5"} Apr 22 16:25:49.634804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.634803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"ff998060eb4e6ffd9408421fe9d5565ac370ce64c9abb1ba1437629cb10312a3"} Apr 22 16:25:49.634804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.634812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"9b3d907585088c29c08a345a45363d05edf06299a0061dd05bb51e9e3a1f1be7"} Apr 22 16:25:49.635246 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.634820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"d458a67faee173aa6ad45e74784f7747987bd82c2047c6ab93e9d85a0f3fcf7b"} Apr 22 16:25:49.635246 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.634829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"f852e92feb25e27387cd95abdc7a6463ada06fce1878d6f0bb1cd57ff1dc825c"} Apr 22 16:25:49.635246 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.634836 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"02918e16-1e6c-44cd-9fc0-2e3caac620b6","Type":"ContainerStarted","Data":"de1872bec699f263522a208535c8358025e6b29c7ff96f3d9b5594860719b56a"} Apr 22 16:25:49.666525 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.666480 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.666467269 podStartE2EDuration="2.666467269s" podCreationTimestamp="2026-04-22 16:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:25:49.664206057 +0000 UTC m=+249.366224166" watchObservedRunningTime="2026-04-22 16:25:49.666467269 +0000 UTC m=+249.368485378" Apr 22 16:25:49.830945 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.830911 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-54f4944c8f-wns2w"] Apr 22 16:25:49.834092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.834076 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.836384 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.836356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 16:25:49.836498 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.836357 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 16:25:49.836498 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.836399 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 16:25:49.836498 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.836422 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 16:25:49.836771 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.836753 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 16:25:49.836947 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.836932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-vzrmj\"" Apr 22 16:25:49.841497 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.841476 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 16:25:49.851063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.848930 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54f4944c8f-wns2w"] Apr 22 16:25:49.880705 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-serving-certs-ca-bundle\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880802 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-telemeter-client-tls\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880802 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880884 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-metrics-client-ca\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880926 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-secret-telemeter-client\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880926 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-federate-client-tls\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880984 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.880984 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.880949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzd9\" (UniqueName: \"kubernetes.io/projected/69e0156f-0f0d-42db-8022-313f141d4dc1-kube-api-access-9qzd9\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981393 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: 
\"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzd9\" (UniqueName: \"kubernetes.io/projected/69e0156f-0f0d-42db-8022-313f141d4dc1-kube-api-access-9qzd9\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981420 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-serving-certs-ca-bundle\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981696 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-telemeter-client-tls\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981754 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981791 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:25:49.981774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-metrics-client-ca\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981844 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-secret-telemeter-client\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981892 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-federate-client-tls\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.981994 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.981977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-serving-certs-ca-bundle\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.982337 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.982310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.982672 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.982652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69e0156f-0f0d-42db-8022-313f141d4dc1-metrics-client-ca\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.984327 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.984297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-secret-telemeter-client\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.984410 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.984328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-telemeter-client-tls\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.984757 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.984739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-federate-client-tls\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.984814 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.984800 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69e0156f-0f0d-42db-8022-313f141d4dc1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:49.989927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:49.989908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzd9\" (UniqueName: \"kubernetes.io/projected/69e0156f-0f0d-42db-8022-313f141d4dc1-kube-api-access-9qzd9\") pod \"telemeter-client-54f4944c8f-wns2w\" (UID: \"69e0156f-0f0d-42db-8022-313f141d4dc1\") " pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:50.143626 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:50.143580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" Apr 22 16:25:50.264813 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:50.264611 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54f4944c8f-wns2w"] Apr 22 16:25:50.267280 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:25:50.267253 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e0156f_0f0d_42db_8022_313f141d4dc1.slice/crio-792e13108ad4f1a0dea6e8f1e6207a2ecbf7c9612ce4d3871dad80cfde55b3a0 WatchSource:0}: Error finding container 792e13108ad4f1a0dea6e8f1e6207a2ecbf7c9612ce4d3871dad80cfde55b3a0: Status 404 returned error can't find the container with id 792e13108ad4f1a0dea6e8f1e6207a2ecbf7c9612ce4d3871dad80cfde55b3a0 Apr 22 16:25:50.639841 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:50.639760 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" 
event={"ID":"69e0156f-0f0d-42db-8022-313f141d4dc1","Type":"ContainerStarted","Data":"792e13108ad4f1a0dea6e8f1e6207a2ecbf7c9612ce4d3871dad80cfde55b3a0"} Apr 22 16:25:52.608326 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.608292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:25:52.610606 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.610584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09f37d35-30d1-4fc0-a88f-3514e6c16586-metrics-certs\") pod \"network-metrics-daemon-5wqw7\" (UID: \"09f37d35-30d1-4fc0-a88f-3514e6c16586\") " pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:25:52.648149 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.648115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" event={"ID":"69e0156f-0f0d-42db-8022-313f141d4dc1","Type":"ContainerStarted","Data":"0515a37d92c9bf53bae8472ec9b7ca201829f040ae2d6dbf434e758f14670627"} Apr 22 16:25:52.648149 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.648149 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" event={"ID":"69e0156f-0f0d-42db-8022-313f141d4dc1","Type":"ContainerStarted","Data":"bcdf32d703711b89bdf90ce4d14057038bcf7347e5853c1f984f0cfdd20d2432"} Apr 22 16:25:52.703660 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.703631 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dv4l7\"" Apr 22 16:25:52.711713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.711692 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5wqw7" Apr 22 16:25:52.831073 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:52.830917 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5wqw7"] Apr 22 16:25:52.833477 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:25:52.833457 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f37d35_30d1_4fc0_a88f_3514e6c16586.slice/crio-6290b6a25c46d5a74d6c8c17acc8637a226b1d069b0f6c35966c01e5918c095d WatchSource:0}: Error finding container 6290b6a25c46d5a74d6c8c17acc8637a226b1d069b0f6c35966c01e5918c095d: Status 404 returned error can't find the container with id 6290b6a25c46d5a74d6c8c17acc8637a226b1d069b0f6c35966c01e5918c095d Apr 22 16:25:53.652600 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:53.652568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5wqw7" event={"ID":"09f37d35-30d1-4fc0-a88f-3514e6c16586","Type":"ContainerStarted","Data":"6290b6a25c46d5a74d6c8c17acc8637a226b1d069b0f6c35966c01e5918c095d"} Apr 22 16:25:53.654705 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:53.654667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" event={"ID":"69e0156f-0f0d-42db-8022-313f141d4dc1","Type":"ContainerStarted","Data":"2e460fb03746631498ff35b1dbaf22d79be3aa7499a85f7bf2f199d97250e06b"} Apr 22 16:25:53.677935 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:53.677880 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-54f4944c8f-wns2w" podStartSLOduration=2.4375643670000002 podStartE2EDuration="4.677861271s" podCreationTimestamp="2026-04-22 16:25:49 +0000 UTC" firstStartedPulling="2026-04-22 16:25:50.269086701 +0000 UTC m=+249.971104789" lastFinishedPulling="2026-04-22 
16:25:52.509383603 +0000 UTC m=+252.211401693" observedRunningTime="2026-04-22 16:25:53.675053116 +0000 UTC m=+253.377071223" watchObservedRunningTime="2026-04-22 16:25:53.677861271 +0000 UTC m=+253.379879381" Apr 22 16:25:54.216187 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.216155 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dc86597d7-fjv2f"] Apr 22 16:25:54.219466 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.219446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.231607 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.231582 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dc86597d7-fjv2f"] Apr 22 16:25:54.323317 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-oauth-config\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.323317 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-trusted-ca-bundle\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.323512 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-serving-cert\") pod \"console-dc86597d7-fjv2f\" (UID: 
\"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.323512 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-service-ca\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.323512 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-oauth-serving-cert\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.323596 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8r8\" (UniqueName: \"kubernetes.io/projected/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-kube-api-access-db8r8\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.323596 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.323544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-config\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.424689 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-service-ca\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.424863 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-oauth-serving-cert\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.424863 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db8r8\" (UniqueName: \"kubernetes.io/projected/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-kube-api-access-db8r8\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.424863 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-config\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.424863 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-oauth-config\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.425025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424931 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-trusted-ca-bundle\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.425025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.424993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-serving-cert\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.425484 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.425458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-service-ca\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.425598 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.425465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-oauth-serving-cert\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.425714 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.425693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-config\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.425823 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.425804 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-trusted-ca-bundle\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.427449 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.427417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-oauth-config\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.427550 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.427496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-serving-cert\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.433545 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.433523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8r8\" (UniqueName: \"kubernetes.io/projected/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-kube-api-access-db8r8\") pod \"console-dc86597d7-fjv2f\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") " pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.529750 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.529678 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:25:54.649725 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.649698 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dc86597d7-fjv2f"] Apr 22 16:25:54.652768 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:25:54.652741 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f2f57d_06aa_4863_b7bb_4014e9e4a82a.slice/crio-0886e40b067d652f815e9b6cbbc8f2d157599c5ec98fe60ad7b18aa0a0278c60 WatchSource:0}: Error finding container 0886e40b067d652f815e9b6cbbc8f2d157599c5ec98fe60ad7b18aa0a0278c60: Status 404 returned error can't find the container with id 0886e40b067d652f815e9b6cbbc8f2d157599c5ec98fe60ad7b18aa0a0278c60 Apr 22 16:25:54.660025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.659973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5wqw7" event={"ID":"09f37d35-30d1-4fc0-a88f-3514e6c16586","Type":"ContainerStarted","Data":"7dbaf41202ede79858e094e2512939778eaa34e41859b233192475df5d53fc76"} Apr 22 16:25:54.660025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.660009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5wqw7" event={"ID":"09f37d35-30d1-4fc0-a88f-3514e6c16586","Type":"ContainerStarted","Data":"e0a4008bb933558a6929a11cb3a5863ecb43c2941f4c0a7680b7835d3345daa9"} Apr 22 16:25:54.661069 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.661033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc86597d7-fjv2f" event={"ID":"33f2f57d-06aa-4863-b7bb-4014e9e4a82a","Type":"ContainerStarted","Data":"0886e40b067d652f815e9b6cbbc8f2d157599c5ec98fe60ad7b18aa0a0278c60"} Apr 22 16:25:54.675343 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:54.675300 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-5wqw7" podStartSLOduration=253.707807328 podStartE2EDuration="4m14.675267957s" podCreationTimestamp="2026-04-22 16:21:40 +0000 UTC" firstStartedPulling="2026-04-22 16:25:52.83522406 +0000 UTC m=+252.537242151" lastFinishedPulling="2026-04-22 16:25:53.802684689 +0000 UTC m=+253.504702780" observedRunningTime="2026-04-22 16:25:54.674894973 +0000 UTC m=+254.376913083" watchObservedRunningTime="2026-04-22 16:25:54.675267957 +0000 UTC m=+254.377286065" Apr 22 16:25:55.665904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:55.665861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc86597d7-fjv2f" event={"ID":"33f2f57d-06aa-4863-b7bb-4014e9e4a82a","Type":"ContainerStarted","Data":"29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a"} Apr 22 16:25:55.694266 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:25:55.694218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dc86597d7-fjv2f" podStartSLOduration=1.694202057 podStartE2EDuration="1.694202057s" podCreationTimestamp="2026-04-22 16:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:25:55.692068806 +0000 UTC m=+255.394086911" watchObservedRunningTime="2026-04-22 16:25:55.694202057 +0000 UTC m=+255.396220165" Apr 22 16:26:04.530265 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:04.530170 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:26:04.530265 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:04.530225 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:26:04.534815 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:04.534790 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:26:04.702297 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:04.702267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dc86597d7-fjv2f" Apr 22 16:26:04.753343 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:04.753313 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ffd596d5b-dllvs"] Apr 22 16:26:29.772394 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:29.772334 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6ffd596d5b-dllvs" podUID="c28eebd3-109e-4aed-9da2-953e0b8fb062" containerName="console" containerID="cri-o://ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87" gracePeriod=15 Apr 22 16:26:30.006783 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.006762 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ffd596d5b-dllvs_c28eebd3-109e-4aed-9da2-953e0b8fb062/console/0.log" Apr 22 16:26:30.006897 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.006821 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ffd596d5b-dllvs" Apr 22 16:26:30.125684 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125600 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-oauth-config\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.125684 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125672 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht6rl\" (UniqueName: \"kubernetes.io/projected/c28eebd3-109e-4aed-9da2-953e0b8fb062-kube-api-access-ht6rl\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.125880 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125699 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-trusted-ca-bundle\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.125880 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125806 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-config\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.125880 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125849 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-oauth-serving-cert\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.126029 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125881 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-serving-cert\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.126029 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.125907 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-service-ca\") pod \"c28eebd3-109e-4aed-9da2-953e0b8fb062\" (UID: \"c28eebd3-109e-4aed-9da2-953e0b8fb062\") " Apr 22 16:26:30.126206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.126179 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-config" (OuterVolumeSpecName: "console-config") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:26:30.126206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.126194 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:26:30.126381 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.126341 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:26:30.126487 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.126430 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-service-ca" (OuterVolumeSpecName: "service-ca") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:26:30.127884 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.127861 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28eebd3-109e-4aed-9da2-953e0b8fb062-kube-api-access-ht6rl" (OuterVolumeSpecName: "kube-api-access-ht6rl") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "kube-api-access-ht6rl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:26:30.128505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.128480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:26:30.128586 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.128505 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c28eebd3-109e-4aed-9da2-953e0b8fb062" (UID: "c28eebd3-109e-4aed-9da2-953e0b8fb062"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:26:30.226944 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226919 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-oauth-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.226944 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226941 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ht6rl\" (UniqueName: \"kubernetes.io/projected/c28eebd3-109e-4aed-9da2-953e0b8fb062-kube-api-access-ht6rl\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.227098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226950 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-trusted-ca-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.227098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226959 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.227098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226967 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-oauth-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.227098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226975 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c28eebd3-109e-4aed-9da2-953e0b8fb062-console-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.227098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.226984 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c28eebd3-109e-4aed-9da2-953e0b8fb062-service-ca\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:26:30.774184 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.774158 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ffd596d5b-dllvs_c28eebd3-109e-4aed-9da2-953e0b8fb062/console/0.log" Apr 22 16:26:30.774571 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.774195 2575 generic.go:358] "Generic (PLEG): container finished" podID="c28eebd3-109e-4aed-9da2-953e0b8fb062" containerID="ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87" exitCode=2 Apr 22 16:26:30.774571 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.774228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffd596d5b-dllvs" event={"ID":"c28eebd3-109e-4aed-9da2-953e0b8fb062","Type":"ContainerDied","Data":"ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87"} Apr 22 16:26:30.774571 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.774255 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ffd596d5b-dllvs" Apr 22 16:26:30.774571 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.774265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffd596d5b-dllvs" event={"ID":"c28eebd3-109e-4aed-9da2-953e0b8fb062","Type":"ContainerDied","Data":"8a9f8a5a67cd78565ae9a48caaf6ac928a7f6a76e06c055baa248bea4cd9d95a"} Apr 22 16:26:30.774571 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.774284 2575 scope.go:117] "RemoveContainer" containerID="ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87" Apr 22 16:26:30.782774 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.782737 2575 scope.go:117] "RemoveContainer" containerID="ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87" Apr 22 16:26:30.782991 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:26:30.782971 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87\": container with ID starting with ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87 not found: ID does not exist" containerID="ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87" Apr 22 16:26:30.783058 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.782999 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87"} err="failed to get container status \"ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87\": rpc error: code = NotFound desc = could not find container \"ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87\": container with ID starting with ff06c2adb3afa1e2468a1d17d33ed787b633c3d62dd0cfee26de98c01dbf8f87 not found: ID does not exist" Apr 22 16:26:30.794676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.794657 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ffd596d5b-dllvs"] Apr 22 16:26:30.800416 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.800397 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6ffd596d5b-dllvs"] Apr 22 16:26:30.904339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:30.904307 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28eebd3-109e-4aed-9da2-953e0b8fb062" path="/var/lib/kubelet/pods/c28eebd3-109e-4aed-9da2-953e0b8fb062/volumes" Apr 22 16:26:40.769480 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:40.769453 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:26:40.771669 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:40.771649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:26:40.772625 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:26:40.772587 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 16:27:01.153634 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.153592 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55c86db4d7-gtq4w"] Apr 22 16:27:01.156057 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.153966 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c28eebd3-109e-4aed-9da2-953e0b8fb062" containerName="console" Apr 22 16:27:01.156057 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.153980 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28eebd3-109e-4aed-9da2-953e0b8fb062" containerName="console" Apr 22 16:27:01.156057 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.154069 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c28eebd3-109e-4aed-9da2-953e0b8fb062" containerName="console" Apr 22 16:27:01.156860 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.156838 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.168955 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.168934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c86db4d7-gtq4w"] Apr 22 16:27:01.174571 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645br\" (UniqueName: \"kubernetes.io/projected/725847a7-7a70-400a-b864-7422a85f62b0-kube-api-access-645br\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.174677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-serving-cert\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.174677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-console-config\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.174758 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-service-ca\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.174804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-trusted-ca-bundle\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.174847 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-oauth-config\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.174881 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.174861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-oauth-serving-cert\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.275703 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-service-ca\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.275882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275712 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-trusted-ca-bundle\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.275882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-oauth-config\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.275882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-oauth-serving-cert\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.275882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-645br\" (UniqueName: \"kubernetes.io/projected/725847a7-7a70-400a-b864-7422a85f62b0-kube-api-access-645br\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.275882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-serving-cert\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" 
Apr 22 16:27:01.275882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.275879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-console-config\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.276484 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.276454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-service-ca\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.276647 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.276587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-oauth-serving-cert\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.276716 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.276632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-trusted-ca-bundle\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.276882 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.276861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-console-config\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " 
pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.278235 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.278213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-oauth-config\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.278437 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.278418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-serving-cert\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.284426 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.284404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-645br\" (UniqueName: \"kubernetes.io/projected/725847a7-7a70-400a-b864-7422a85f62b0-kube-api-access-645br\") pod \"console-55c86db4d7-gtq4w\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") " pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.465821 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.465789 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:27:01.587492 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.587435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c86db4d7-gtq4w"] Apr 22 16:27:01.590352 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:27:01.590324 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod725847a7_7a70_400a_b864_7422a85f62b0.slice/crio-65bab558d08c29e445ec870d6a879e23c1669a9ddfa9c4c43cfd4184d848f97e WatchSource:0}: Error finding container 65bab558d08c29e445ec870d6a879e23c1669a9ddfa9c4c43cfd4184d848f97e: Status 404 returned error can't find the container with id 65bab558d08c29e445ec870d6a879e23c1669a9ddfa9c4c43cfd4184d848f97e Apr 22 16:27:01.592098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.592081 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:27:01.876800 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.876706 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c86db4d7-gtq4w" event={"ID":"725847a7-7a70-400a-b864-7422a85f62b0","Type":"ContainerStarted","Data":"e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2"} Apr 22 16:27:01.876800 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.876744 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c86db4d7-gtq4w" event={"ID":"725847a7-7a70-400a-b864-7422a85f62b0","Type":"ContainerStarted","Data":"65bab558d08c29e445ec870d6a879e23c1669a9ddfa9c4c43cfd4184d848f97e"} Apr 22 16:27:01.897992 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:01.897934 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55c86db4d7-gtq4w" podStartSLOduration=0.897920654 podStartE2EDuration="897.920654ms" podCreationTimestamp="2026-04-22 16:27:01 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:27:01.897664638 +0000 UTC m=+321.599682746" watchObservedRunningTime="2026-04-22 16:27:01.897920654 +0000 UTC m=+321.599938763"
Apr 22 16:27:11.466238 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:11.466192 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55c86db4d7-gtq4w"
Apr 22 16:27:11.466238 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:11.466242 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55c86db4d7-gtq4w"
Apr 22 16:27:11.471147 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:11.471120 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55c86db4d7-gtq4w"
Apr 22 16:27:11.910393 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:11.910315 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55c86db4d7-gtq4w"
Apr 22 16:27:11.954237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:11.954208 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dc86597d7-fjv2f"]
Apr 22 16:27:31.957032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:31.956939 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mxnhl"]
Apr 22 16:27:31.960240 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:31.960224 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:31.962604 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:31.962582 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 16:27:31.974793 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:31.974769 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mxnhl"]
Apr 22 16:27:32.024795 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.024761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2a362d49-91bf-4ec1-b686-5a8676288536-dbus\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.024940 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.024897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2a362d49-91bf-4ec1-b686-5a8676288536-kubelet-config\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.024984 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.024951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a362d49-91bf-4ec1-b686-5a8676288536-original-pull-secret\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.126016 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.125981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2a362d49-91bf-4ec1-b686-5a8676288536-dbus\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.126185 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.126072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2a362d49-91bf-4ec1-b686-5a8676288536-kubelet-config\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.126185 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.126101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a362d49-91bf-4ec1-b686-5a8676288536-original-pull-secret\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.126185 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.126140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2a362d49-91bf-4ec1-b686-5a8676288536-kubelet-config\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.126320 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.126182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2a362d49-91bf-4ec1-b686-5a8676288536-dbus\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.128416 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.128396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a362d49-91bf-4ec1-b686-5a8676288536-original-pull-secret\") pod \"global-pull-secret-syncer-mxnhl\" (UID: \"2a362d49-91bf-4ec1-b686-5a8676288536\") " pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.272528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.272442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxnhl"
Apr 22 16:27:32.391256 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.391176 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mxnhl"]
Apr 22 16:27:32.393934 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:27:32.393906 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a362d49_91bf_4ec1_b686_5a8676288536.slice/crio-892453e9c4cbafbe924b787fc71cd398923c9a8bed18d2cb47014316dc121207 WatchSource:0}: Error finding container 892453e9c4cbafbe924b787fc71cd398923c9a8bed18d2cb47014316dc121207: Status 404 returned error can't find the container with id 892453e9c4cbafbe924b787fc71cd398923c9a8bed18d2cb47014316dc121207
Apr 22 16:27:32.967310 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:32.967266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mxnhl" event={"ID":"2a362d49-91bf-4ec1-b686-5a8676288536","Type":"ContainerStarted","Data":"892453e9c4cbafbe924b787fc71cd398923c9a8bed18d2cb47014316dc121207"}
Apr 22 16:27:36.974864 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:36.974823 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dc86597d7-fjv2f" podUID="33f2f57d-06aa-4863-b7bb-4014e9e4a82a" containerName="console" containerID="cri-o://29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a" gracePeriod=15
Apr 22 16:27:37.406239 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.406218 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dc86597d7-fjv2f_33f2f57d-06aa-4863-b7bb-4014e9e4a82a/console/0.log"
Apr 22 16:27:37.406382 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.406278 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc86597d7-fjv2f"
Apr 22 16:27:37.575098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.574994 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-oauth-config\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575090 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db8r8\" (UniqueName: \"kubernetes.io/projected/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-kube-api-access-db8r8\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575314 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575121 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-serving-cert\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575314 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575143 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-oauth-serving-cert\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575314 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575213 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-service-ca\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575314 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575242 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-config\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575315 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-trusted-ca-bundle\") pod \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\" (UID: \"33f2f57d-06aa-4863-b7bb-4014e9e4a82a\") "
Apr 22 16:27:37.575594 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575548 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:27:37.575594 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575564 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-service-ca" (OuterVolumeSpecName: "service-ca") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:27:37.575838 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575806 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-config" (OuterVolumeSpecName: "console-config") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:27:37.575838 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.575823 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:27:37.577491 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.577460 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-kube-api-access-db8r8" (OuterVolumeSpecName: "kube-api-access-db8r8") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "kube-api-access-db8r8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:27:37.577599 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.577556 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:27:37.577698 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.577683 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "33f2f57d-06aa-4863-b7bb-4014e9e4a82a" (UID: "33f2f57d-06aa-4863-b7bb-4014e9e4a82a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:27:37.676874 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676841 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-service-ca\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.676874 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676868 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.676874 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676876 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-trusted-ca-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.677105 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676885 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-oauth-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.677105 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676894 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-db8r8\" (UniqueName: \"kubernetes.io/projected/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-kube-api-access-db8r8\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.677105 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676903 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-console-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.677105 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.676911 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33f2f57d-06aa-4863-b7bb-4014e9e4a82a-oauth-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:27:37.991279 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.991229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mxnhl" event={"ID":"2a362d49-91bf-4ec1-b686-5a8676288536","Type":"ContainerStarted","Data":"86f73f397a07077c537a9b60e6fd29fa96065eb25d2c745d55fe047405a31c57"}
Apr 22 16:27:37.992425 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.992407 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dc86597d7-fjv2f_33f2f57d-06aa-4863-b7bb-4014e9e4a82a/console/0.log"
Apr 22 16:27:37.992528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.992439 2575 generic.go:358] "Generic (PLEG): container finished" podID="33f2f57d-06aa-4863-b7bb-4014e9e4a82a" containerID="29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a" exitCode=2
Apr 22 16:27:37.992528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.992465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc86597d7-fjv2f" event={"ID":"33f2f57d-06aa-4863-b7bb-4014e9e4a82a","Type":"ContainerDied","Data":"29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a"}
Apr 22 16:27:37.992528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.992496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc86597d7-fjv2f" event={"ID":"33f2f57d-06aa-4863-b7bb-4014e9e4a82a","Type":"ContainerDied","Data":"0886e40b067d652f815e9b6cbbc8f2d157599c5ec98fe60ad7b18aa0a0278c60"}
Apr 22 16:27:37.992528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.992502 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc86597d7-fjv2f"
Apr 22 16:27:37.992528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:37.992510 2575 scope.go:117] "RemoveContainer" containerID="29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a"
Apr 22 16:27:38.000991 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:38.000975 2575 scope.go:117] "RemoveContainer" containerID="29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a"
Apr 22 16:27:38.001270 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:27:38.001253 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a\": container with ID starting with 29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a not found: ID does not exist" containerID="29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a"
Apr 22 16:27:38.001321 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:38.001278 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a"} err="failed to get container status \"29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a\": rpc error: code = NotFound desc = could not find container \"29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a\": container with ID starting with 29f0e0493e7ae046c069b943c4303703717fcf61e702a670934c48feadd0da3a not found: ID does not exist"
Apr 22 16:27:38.006447 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:38.006409 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mxnhl" podStartSLOduration=2.143749998 podStartE2EDuration="7.006398564s" podCreationTimestamp="2026-04-22 16:27:31 +0000 UTC" firstStartedPulling="2026-04-22 16:27:32.395517325 +0000 UTC m=+352.097535413" lastFinishedPulling="2026-04-22 16:27:37.258165885 +0000 UTC m=+356.960183979" observedRunningTime="2026-04-22 16:27:38.005132946 +0000 UTC m=+357.707151057" watchObservedRunningTime="2026-04-22 16:27:38.006398564 +0000 UTC m=+357.708416671"
Apr 22 16:27:38.021592 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:38.021569 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dc86597d7-fjv2f"]
Apr 22 16:27:38.024608 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:38.024584 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dc86597d7-fjv2f"]
Apr 22 16:27:38.904753 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:27:38.904722 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f2f57d-06aa-4863-b7bb-4014e9e4a82a" path="/var/lib/kubelet/pods/33f2f57d-06aa-4863-b7bb-4014e9e4a82a/volumes"
Apr 22 16:31:12.365815 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.365779 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-884fb454c-25glw"]
Apr 22 16:31:12.366292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.366129 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33f2f57d-06aa-4863-b7bb-4014e9e4a82a" containerName="console"
Apr 22 16:31:12.366292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.366141 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f2f57d-06aa-4863-b7bb-4014e9e4a82a" containerName="console"
Apr 22 16:31:12.366292 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.366198 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="33f2f57d-06aa-4863-b7bb-4014e9e4a82a" containerName="console"
Apr 22 16:31:12.368849 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.368835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-oauth-serving-cert\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403698 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-oauth-config\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403698 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-config\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403698 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-serving-cert\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvtlv\" (UniqueName: \"kubernetes.io/projected/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-kube-api-access-lvtlv\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403777 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-trusted-ca-bundle\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.403810 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.403799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-service-ca\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.457580 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.457544 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-884fb454c-25glw"]
Apr 22 16:31:12.504977 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.504943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvtlv\" (UniqueName: \"kubernetes.io/projected/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-kube-api-access-lvtlv\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.504977 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.504978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-trusted-ca-bundle\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.505232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-service-ca\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.505232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-oauth-serving-cert\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.505232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-oauth-config\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.505232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-config\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.505232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-serving-cert\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.505852 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-config\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.506126 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-oauth-serving-cert\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.506126 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.505990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-trusted-ca-bundle\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.506126 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.506035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-service-ca\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.507678 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.507651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-oauth-config\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.507785 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.507758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-console-serving-cert\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.514562 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.514541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvtlv\" (UniqueName: \"kubernetes.io/projected/d2ec5175-61f0-4b9a-b8ae-64af4003cbf9-kube-api-access-lvtlv\") pod \"console-884fb454c-25glw\" (UID: \"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9\") " pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.678127 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.678024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:12.810227 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:12.810197 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-884fb454c-25glw"]
Apr 22 16:31:12.813222 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:31:12.813191 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ec5175_61f0_4b9a_b8ae_64af4003cbf9.slice/crio-3e6b91093200f3058f9a746400d3bd85e7e910723b5ee06051c71a6a13afeab0 WatchSource:0}: Error finding container 3e6b91093200f3058f9a746400d3bd85e7e910723b5ee06051c71a6a13afeab0: Status 404 returned error can't find the container with id 3e6b91093200f3058f9a746400d3bd85e7e910723b5ee06051c71a6a13afeab0
Apr 22 16:31:13.617636 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:13.617595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-884fb454c-25glw" event={"ID":"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9","Type":"ContainerStarted","Data":"dc73cbeba9a0e986823b58a1fd70a2c9533bfbc351848049ffbb3dd97f652387"}
Apr 22 16:31:13.617636 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:13.617631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-884fb454c-25glw" event={"ID":"d2ec5175-61f0-4b9a-b8ae-64af4003cbf9","Type":"ContainerStarted","Data":"3e6b91093200f3058f9a746400d3bd85e7e910723b5ee06051c71a6a13afeab0"}
Apr 22 16:31:13.642436 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:13.642385 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-884fb454c-25glw" podStartSLOduration=1.642370944 podStartE2EDuration="1.642370944s" podCreationTimestamp="2026-04-22 16:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:31:13.640308645 +0000 UTC m=+573.342326754" watchObservedRunningTime="2026-04-22 16:31:13.642370944 +0000 UTC m=+573.344389053"
Apr 22 16:31:22.679202 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:22.679165 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:22.679671 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:22.679433 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:22.683580 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:22.683561 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:23.652058 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:23.652016 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-884fb454c-25glw"
Apr 22 16:31:23.709169 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:23.709133 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55c86db4d7-gtq4w"]
Apr 22 16:31:40.793939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:40.793909 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log"
Apr 22 16:31:40.794423 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:40.794169 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log"
Apr 22 16:31:48.740397 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:48.740362 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55c86db4d7-gtq4w" podUID="725847a7-7a70-400a-b864-7422a85f62b0" containerName="console" containerID="cri-o://e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2" gracePeriod=15
Apr 22 16:31:48.972350 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:48.972322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c86db4d7-gtq4w_725847a7-7a70-400a-b864-7422a85f62b0/console/0.log"
Apr 22 16:31:48.972455 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:48.972389 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c86db4d7-gtq4w"
Apr 22 16:31:49.021538 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021464 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-oauth-serving-cert\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021538 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021506 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-645br\" (UniqueName: \"kubernetes.io/projected/725847a7-7a70-400a-b864-7422a85f62b0-kube-api-access-645br\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021735 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021633 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-console-config\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021735 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021700 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-oauth-config\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021834 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021742 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-service-ca\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021834 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021776 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-trusted-ca-bundle\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021834 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021808 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-serving-cert\") pod \"725847a7-7a70-400a-b864-7422a85f62b0\" (UID: \"725847a7-7a70-400a-b864-7422a85f62b0\") "
Apr 22 16:31:49.021990 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.021883 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:31:49.022134 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.022083 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-console-config" (OuterVolumeSpecName: "console-config") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:31:49.022134 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.022103 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-service-ca" (OuterVolumeSpecName: "service-ca") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:31:49.022311 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.022199 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:31:49.022311 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.022219 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-console-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:31:49.022311 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.022237 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-service-ca\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:31:49.022311 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.022251 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-oauth-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:31:49.024098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.024068 2575
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:31:49.024432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.024409 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:31:49.024502 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.024443 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725847a7-7a70-400a-b864-7422a85f62b0-kube-api-access-645br" (OuterVolumeSpecName: "kube-api-access-645br") pod "725847a7-7a70-400a-b864-7422a85f62b0" (UID: "725847a7-7a70-400a-b864-7422a85f62b0"). InnerVolumeSpecName "kube-api-access-645br". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:31:49.122671 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.122635 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-serving-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:31:49.122671 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.122666 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-645br\" (UniqueName: \"kubernetes.io/projected/725847a7-7a70-400a-b864-7422a85f62b0-kube-api-access-645br\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:31:49.122671 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.122677 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725847a7-7a70-400a-b864-7422a85f62b0-console-oauth-config\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:31:49.122883 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.122685 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725847a7-7a70-400a-b864-7422a85f62b0-trusted-ca-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:31:49.725115 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.725091 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c86db4d7-gtq4w_725847a7-7a70-400a-b864-7422a85f62b0/console/0.log" Apr 22 16:31:49.725313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.725128 2575 generic.go:358] "Generic (PLEG): container finished" podID="725847a7-7a70-400a-b864-7422a85f62b0" containerID="e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2" exitCode=2 Apr 22 16:31:49.725313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.725168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-55c86db4d7-gtq4w" event={"ID":"725847a7-7a70-400a-b864-7422a85f62b0","Type":"ContainerDied","Data":"e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2"} Apr 22 16:31:49.725313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.725204 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c86db4d7-gtq4w" Apr 22 16:31:49.725313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.725220 2575 scope.go:117] "RemoveContainer" containerID="e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2" Apr 22 16:31:49.725313 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.725210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c86db4d7-gtq4w" event={"ID":"725847a7-7a70-400a-b864-7422a85f62b0","Type":"ContainerDied","Data":"65bab558d08c29e445ec870d6a879e23c1669a9ddfa9c4c43cfd4184d848f97e"} Apr 22 16:31:49.733694 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.733678 2575 scope.go:117] "RemoveContainer" containerID="e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2" Apr 22 16:31:49.733946 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:31:49.733926 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2\": container with ID starting with e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2 not found: ID does not exist" containerID="e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2" Apr 22 16:31:49.734032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.733950 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2"} err="failed to get container status \"e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2\": rpc error: code = 
NotFound desc = could not find container \"e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2\": container with ID starting with e0dd683e5496030d6c7ebafa7611b4d46eb8de1caa2948b550356b6fdf92a4e2 not found: ID does not exist" Apr 22 16:31:49.746569 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.746527 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55c86db4d7-gtq4w"] Apr 22 16:31:49.751143 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:49.751120 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55c86db4d7-gtq4w"] Apr 22 16:31:50.905575 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:31:50.905542 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725847a7-7a70-400a-b864-7422a85f62b0" path="/var/lib/kubelet/pods/725847a7-7a70-400a-b864-7422a85f62b0/volumes" Apr 22 16:36:40.817116 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:36:40.817026 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:36:40.818551 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:36:40.818531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:41:40.838248 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:41:40.838222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:41:40.840499 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:41:40.840057 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:46:40.860242 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:46:40.860122 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:46:40.864123 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:46:40.862475 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:48:46.295834 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.295760 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt"] Apr 22 16:48:46.296281 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.296161 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="725847a7-7a70-400a-b864-7422a85f62b0" containerName="console" Apr 22 16:48:46.296281 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.296174 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="725847a7-7a70-400a-b864-7422a85f62b0" containerName="console" Apr 22 16:48:46.296281 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.296232 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="725847a7-7a70-400a-b864-7422a85f62b0" containerName="console" Apr 22 16:48:46.299323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.299307 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.301804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.301781 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lwknc\"" Apr 22 16:48:46.301927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.301781 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:48:46.302672 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.302657 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:48:46.321710 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.321689 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt"] Apr 22 16:48:46.447403 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.447372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.447562 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.447444 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nndz\" (UniqueName: \"kubernetes.io/projected/3c39e881-c786-4ec3-9c88-3d4045bc985c-kube-api-access-8nndz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.447562 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.447466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.548326 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.548238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.548326 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.548306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.548539 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.548336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nndz\" (UniqueName: \"kubernetes.io/projected/3c39e881-c786-4ec3-9c88-3d4045bc985c-kube-api-access-8nndz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.548676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.548656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.548734 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.548703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.557217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.557190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nndz\" (UniqueName: \"kubernetes.io/projected/3c39e881-c786-4ec3-9c88-3d4045bc985c-kube-api-access-8nndz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.608088 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.608065 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:48:46.728136 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.728111 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt"] Apr 22 16:48:46.730127 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:48:46.730098 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c39e881_c786_4ec3_9c88_3d4045bc985c.slice/crio-c42bbd95bf0f47818742e6c619997cfb03e3e5b9da3cf40ddf45894fee34799d WatchSource:0}: Error finding container c42bbd95bf0f47818742e6c619997cfb03e3e5b9da3cf40ddf45894fee34799d: Status 404 returned error can't find the container with id c42bbd95bf0f47818742e6c619997cfb03e3e5b9da3cf40ddf45894fee34799d Apr 22 16:48:46.731941 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:46.731916 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:48:47.694054 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:47.693988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" event={"ID":"3c39e881-c786-4ec3-9c88-3d4045bc985c","Type":"ContainerStarted","Data":"c42bbd95bf0f47818742e6c619997cfb03e3e5b9da3cf40ddf45894fee34799d"} Apr 22 16:48:53.716059 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:53.716008 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerID="4a1f43e6cb0d08eb6c0c7718d7e0db5ffb13a0eb8a1bbbcb6bd2ca1fbf37453a" exitCode=0 Apr 22 16:48:53.716454 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:53.716092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" 
event={"ID":"3c39e881-c786-4ec3-9c88-3d4045bc985c","Type":"ContainerDied","Data":"4a1f43e6cb0d08eb6c0c7718d7e0db5ffb13a0eb8a1bbbcb6bd2ca1fbf37453a"} Apr 22 16:48:56.728059 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:56.728011 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerID="b04661d40ab01fdea2a4ee2dc2b10dbea016db14268361df938f1f79f5f7dd25" exitCode=0 Apr 22 16:48:56.728416 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:48:56.728077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" event={"ID":"3c39e881-c786-4ec3-9c88-3d4045bc985c","Type":"ContainerDied","Data":"b04661d40ab01fdea2a4ee2dc2b10dbea016db14268361df938f1f79f5f7dd25"} Apr 22 16:49:03.754290 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:03.754249 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerID="6b40ce93876c4e0e16411d45ef671dd0dae57388d873dc8ea43d9de7625a31ed" exitCode=0 Apr 22 16:49:03.754770 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:03.754301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" event={"ID":"3c39e881-c786-4ec3-9c88-3d4045bc985c","Type":"ContainerDied","Data":"6b40ce93876c4e0e16411d45ef671dd0dae57388d873dc8ea43d9de7625a31ed"} Apr 22 16:49:04.872775 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.872751 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:49:04.917751 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.917724 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-bundle\") pod \"3c39e881-c786-4ec3-9c88-3d4045bc985c\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " Apr 22 16:49:04.917751 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.917755 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nndz\" (UniqueName: \"kubernetes.io/projected/3c39e881-c786-4ec3-9c88-3d4045bc985c-kube-api-access-8nndz\") pod \"3c39e881-c786-4ec3-9c88-3d4045bc985c\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " Apr 22 16:49:04.917980 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.917776 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-util\") pod \"3c39e881-c786-4ec3-9c88-3d4045bc985c\" (UID: \"3c39e881-c786-4ec3-9c88-3d4045bc985c\") " Apr 22 16:49:04.918401 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.918359 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-bundle" (OuterVolumeSpecName: "bundle") pod "3c39e881-c786-4ec3-9c88-3d4045bc985c" (UID: "3c39e881-c786-4ec3-9c88-3d4045bc985c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:49:04.920002 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.919975 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c39e881-c786-4ec3-9c88-3d4045bc985c-kube-api-access-8nndz" (OuterVolumeSpecName: "kube-api-access-8nndz") pod "3c39e881-c786-4ec3-9c88-3d4045bc985c" (UID: "3c39e881-c786-4ec3-9c88-3d4045bc985c"). InnerVolumeSpecName "kube-api-access-8nndz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:49:04.922925 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:04.922895 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-util" (OuterVolumeSpecName: "util") pod "3c39e881-c786-4ec3-9c88-3d4045bc985c" (UID: "3c39e881-c786-4ec3-9c88-3d4045bc985c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:49:05.018597 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:05.018524 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:49:05.018597 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:05.018548 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nndz\" (UniqueName: \"kubernetes.io/projected/3c39e881-c786-4ec3-9c88-3d4045bc985c-kube-api-access-8nndz\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:49:05.018597 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:05.018557 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c39e881-c786-4ec3-9c88-3d4045bc985c-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:49:05.762101 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:05.762067 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" event={"ID":"3c39e881-c786-4ec3-9c88-3d4045bc985c","Type":"ContainerDied","Data":"c42bbd95bf0f47818742e6c619997cfb03e3e5b9da3cf40ddf45894fee34799d"} Apr 22 16:49:05.762101 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:05.762104 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42bbd95bf0f47818742e6c619997cfb03e3e5b9da3cf40ddf45894fee34799d" Apr 22 16:49:05.762300 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:05.762131 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dkzsqt" Apr 22 16:49:09.376835 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.376804 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c"] Apr 22 16:49:09.377323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377281 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="pull" Apr 22 16:49:09.377323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377299 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="pull" Apr 22 16:49:09.377323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377313 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="extract" Apr 22 16:49:09.377323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377321 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="extract" Apr 22 16:49:09.377566 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377356 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="util" Apr 22 16:49:09.377566 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377365 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="util" Apr 22 16:49:09.377566 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.377443 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c39e881-c786-4ec3-9c88-3d4045bc985c" containerName="extract" Apr 22 16:49:09.380396 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.380375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.383383 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.383360 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:49:09.383550 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.383440 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 16:49:09.383607 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.383560 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6kjvn\"" Apr 22 16:49:09.393058 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.393014 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c"] Apr 22 16:49:09.446029 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.446002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b473870f-0624-4865-9f33-beac5c1dc0e5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rkf8c\" (UID: 
\"b473870f-0624-4865-9f33-beac5c1dc0e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.446167 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.446095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66762\" (UniqueName: \"kubernetes.io/projected/b473870f-0624-4865-9f33-beac5c1dc0e5-kube-api-access-66762\") pod \"cert-manager-operator-controller-manager-54b9655956-rkf8c\" (UID: \"b473870f-0624-4865-9f33-beac5c1dc0e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.546759 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.546727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66762\" (UniqueName: \"kubernetes.io/projected/b473870f-0624-4865-9f33-beac5c1dc0e5-kube-api-access-66762\") pod \"cert-manager-operator-controller-manager-54b9655956-rkf8c\" (UID: \"b473870f-0624-4865-9f33-beac5c1dc0e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.546880 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.546782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b473870f-0624-4865-9f33-beac5c1dc0e5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rkf8c\" (UID: \"b473870f-0624-4865-9f33-beac5c1dc0e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.547128 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.547109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b473870f-0624-4865-9f33-beac5c1dc0e5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-rkf8c\" (UID: \"b473870f-0624-4865-9f33-beac5c1dc0e5\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.554655 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.554632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66762\" (UniqueName: \"kubernetes.io/projected/b473870f-0624-4865-9f33-beac5c1dc0e5-kube-api-access-66762\") pod \"cert-manager-operator-controller-manager-54b9655956-rkf8c\" (UID: \"b473870f-0624-4865-9f33-beac5c1dc0e5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.689117 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.689090 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" Apr 22 16:49:09.854340 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:09.854314 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c"] Apr 22 16:49:09.856716 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:09.856685 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb473870f_0624_4865_9f33_beac5c1dc0e5.slice/crio-e5f80a47b89ffe1c5abb72b5bfb45f09d799379e648eef9c36c5147810d8179e WatchSource:0}: Error finding container e5f80a47b89ffe1c5abb72b5bfb45f09d799379e648eef9c36c5147810d8179e: Status 404 returned error can't find the container with id e5f80a47b89ffe1c5abb72b5bfb45f09d799379e648eef9c36c5147810d8179e Apr 22 16:49:10.779379 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:10.779335 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" event={"ID":"b473870f-0624-4865-9f33-beac5c1dc0e5","Type":"ContainerStarted","Data":"e5f80a47b89ffe1c5abb72b5bfb45f09d799379e648eef9c36c5147810d8179e"} Apr 22 16:49:11.784707 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:11.784628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" event={"ID":"b473870f-0624-4865-9f33-beac5c1dc0e5","Type":"ContainerStarted","Data":"24714766baf5fdb2e962e0a0df52723ed02f95de178affd5176f59cc7da83beb"} Apr 22 16:49:11.812150 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:11.812098 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-rkf8c" podStartSLOduration=1.1753429579999999 podStartE2EDuration="2.812082095s" podCreationTimestamp="2026-04-22 16:49:09 +0000 UTC" firstStartedPulling="2026-04-22 16:49:09.859912081 +0000 UTC m=+1649.561930168" lastFinishedPulling="2026-04-22 16:49:11.496651214 +0000 UTC m=+1651.198669305" observedRunningTime="2026-04-22 16:49:11.810434721 +0000 UTC m=+1651.512452830" watchObservedRunningTime="2026-04-22 16:49:11.812082095 +0000 UTC m=+1651.514100206" Apr 22 16:49:13.256533 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.256500 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4"] Apr 22 16:49:13.260232 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.260210 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.262972 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.262952 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:49:13.263206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.263188 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:49:13.265187 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.265158 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lwknc\"" Apr 22 16:49:13.275989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.275967 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4"] Apr 22 16:49:13.281456 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.281435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.281552 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.281476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.281552 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.281537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9q9p\" (UniqueName: \"kubernetes.io/projected/f955c79c-a1d6-44db-93ea-68cc1c154da3-kube-api-access-n9q9p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.382360 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.382325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.382542 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.382367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.382542 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.382406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9q9p\" (UniqueName: \"kubernetes.io/projected/f955c79c-a1d6-44db-93ea-68cc1c154da3-kube-api-access-n9q9p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.382696 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:49:13.382674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.382745 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.382720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.402273 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.402246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9q9p\" (UniqueName: \"kubernetes.io/projected/f955c79c-a1d6-44db-93ea-68cc1c154da3-kube-api-access-n9q9p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.569962 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.569894 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:13.695098 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.695073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4"] Apr 22 16:49:13.697432 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:13.697405 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf955c79c_a1d6_44db_93ea_68cc1c154da3.slice/crio-4ff702a62872c4bfc9d80035c4822c69d5c67d53c1211ecdc14dea850b56f559 WatchSource:0}: Error finding container 4ff702a62872c4bfc9d80035c4822c69d5c67d53c1211ecdc14dea850b56f559: Status 404 returned error can't find the container with id 4ff702a62872c4bfc9d80035c4822c69d5c67d53c1211ecdc14dea850b56f559 Apr 22 16:49:13.785315 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.785280 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-kff8d"] Apr 22 16:49:13.788446 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.788425 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:13.791180 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.791158 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 16:49:13.791323 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.791228 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-l4vtr\"" Apr 22 16:49:13.791441 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.791405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 16:49:13.793967 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.793934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" event={"ID":"f955c79c-a1d6-44db-93ea-68cc1c154da3","Type":"ContainerStarted","Data":"af49286b898bf45b2a588a402af13ca10e4cf71245274768bfb302c2b95aa078"} Apr 22 16:49:13.794092 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.793980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" event={"ID":"f955c79c-a1d6-44db-93ea-68cc1c154da3","Type":"ContainerStarted","Data":"4ff702a62872c4bfc9d80035c4822c69d5c67d53c1211ecdc14dea850b56f559"} Apr 22 16:49:13.806943 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.806921 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-kff8d"] Apr 22 16:49:13.886236 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.886140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9t44\" (UniqueName: \"kubernetes.io/projected/ea8a77be-bb69-43cd-8c9c-571998741217-kube-api-access-c9t44\") pod 
\"cert-manager-webhook-587ccfb98-kff8d\" (UID: \"ea8a77be-bb69-43cd-8c9c-571998741217\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:13.886406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.886281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea8a77be-bb69-43cd-8c9c-571998741217-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-kff8d\" (UID: \"ea8a77be-bb69-43cd-8c9c-571998741217\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:13.987609 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.987575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea8a77be-bb69-43cd-8c9c-571998741217-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-kff8d\" (UID: \"ea8a77be-bb69-43cd-8c9c-571998741217\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:13.987794 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.987635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9t44\" (UniqueName: \"kubernetes.io/projected/ea8a77be-bb69-43cd-8c9c-571998741217-kube-api-access-c9t44\") pod \"cert-manager-webhook-587ccfb98-kff8d\" (UID: \"ea8a77be-bb69-43cd-8c9c-571998741217\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:13.999503 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.999473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea8a77be-bb69-43cd-8c9c-571998741217-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-kff8d\" (UID: \"ea8a77be-bb69-43cd-8c9c-571998741217\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:13.999622 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:13.999604 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c9t44\" (UniqueName: \"kubernetes.io/projected/ea8a77be-bb69-43cd-8c9c-571998741217-kube-api-access-c9t44\") pod \"cert-manager-webhook-587ccfb98-kff8d\" (UID: \"ea8a77be-bb69-43cd-8c9c-571998741217\") " pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:14.109458 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:14.109426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:14.231172 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:14.231136 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-kff8d"] Apr 22 16:49:14.233771 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:14.233746 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea8a77be_bb69_43cd_8c9c_571998741217.slice/crio-ae69c5511893b8ef00b3ffdcf776291d5e08b351a1384be4737765218af75998 WatchSource:0}: Error finding container ae69c5511893b8ef00b3ffdcf776291d5e08b351a1384be4737765218af75998: Status 404 returned error can't find the container with id ae69c5511893b8ef00b3ffdcf776291d5e08b351a1384be4737765218af75998 Apr 22 16:49:14.799580 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:14.799550 2575 generic.go:358] "Generic (PLEG): container finished" podID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerID="af49286b898bf45b2a588a402af13ca10e4cf71245274768bfb302c2b95aa078" exitCode=0 Apr 22 16:49:14.799939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:14.799610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" event={"ID":"f955c79c-a1d6-44db-93ea-68cc1c154da3","Type":"ContainerDied","Data":"af49286b898bf45b2a588a402af13ca10e4cf71245274768bfb302c2b95aa078"} Apr 22 16:49:14.803445 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 16:49:14.803383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" event={"ID":"ea8a77be-bb69-43cd-8c9c-571998741217","Type":"ContainerStarted","Data":"ae69c5511893b8ef00b3ffdcf776291d5e08b351a1384be4737765218af75998"} Apr 22 16:49:17.816060 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:17.816001 2575 generic.go:358] "Generic (PLEG): container finished" podID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerID="4c88d5a85114b1e5f3bb41d8a7316c3ac5a5ac00266f29164b966852e6c8ac3b" exitCode=0 Apr 22 16:49:17.816429 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:17.816080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" event={"ID":"f955c79c-a1d6-44db-93ea-68cc1c154da3","Type":"ContainerDied","Data":"4c88d5a85114b1e5f3bb41d8a7316c3ac5a5ac00266f29164b966852e6c8ac3b"} Apr 22 16:49:18.821793 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:18.821756 2575 generic.go:358] "Generic (PLEG): container finished" podID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerID="cb5eb21021503080b1d81a0f7efc1843790e415cede8061e351b5c2d7dee9da9" exitCode=0 Apr 22 16:49:18.822205 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:18.821800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" event={"ID":"f955c79c-a1d6-44db-93ea-68cc1c154da3","Type":"ContainerDied","Data":"cb5eb21021503080b1d81a0f7efc1843790e415cede8061e351b5c2d7dee9da9"} Apr 22 16:49:19.949532 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:19.949512 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:20.040500 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.040466 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-util\") pod \"f955c79c-a1d6-44db-93ea-68cc1c154da3\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " Apr 22 16:49:20.040683 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.040512 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-bundle\") pod \"f955c79c-a1d6-44db-93ea-68cc1c154da3\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " Apr 22 16:49:20.040683 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.040587 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9q9p\" (UniqueName: \"kubernetes.io/projected/f955c79c-a1d6-44db-93ea-68cc1c154da3-kube-api-access-n9q9p\") pod \"f955c79c-a1d6-44db-93ea-68cc1c154da3\" (UID: \"f955c79c-a1d6-44db-93ea-68cc1c154da3\") " Apr 22 16:49:20.040902 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.040881 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-bundle" (OuterVolumeSpecName: "bundle") pod "f955c79c-a1d6-44db-93ea-68cc1c154da3" (UID: "f955c79c-a1d6-44db-93ea-68cc1c154da3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:49:20.042700 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.042668 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f955c79c-a1d6-44db-93ea-68cc1c154da3-kube-api-access-n9q9p" (OuterVolumeSpecName: "kube-api-access-n9q9p") pod "f955c79c-a1d6-44db-93ea-68cc1c154da3" (UID: "f955c79c-a1d6-44db-93ea-68cc1c154da3"). InnerVolumeSpecName "kube-api-access-n9q9p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:49:20.045211 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.045167 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-util" (OuterVolumeSpecName: "util") pod "f955c79c-a1d6-44db-93ea-68cc1c154da3" (UID: "f955c79c-a1d6-44db-93ea-68cc1c154da3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:49:20.141758 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.141683 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:49:20.141758 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.141707 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f955c79c-a1d6-44db-93ea-68cc1c154da3-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:49:20.141758 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.141716 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9q9p\" (UniqueName: \"kubernetes.io/projected/f955c79c-a1d6-44db-93ea-68cc1c154da3-kube-api-access-n9q9p\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:49:20.829959 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.829928 2575 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" Apr 22 16:49:20.830144 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.829933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fj9vk4" event={"ID":"f955c79c-a1d6-44db-93ea-68cc1c154da3","Type":"ContainerDied","Data":"4ff702a62872c4bfc9d80035c4822c69d5c67d53c1211ecdc14dea850b56f559"} Apr 22 16:49:20.830144 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:20.830053 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ff702a62872c4bfc9d80035c4822c69d5c67d53c1211ecdc14dea850b56f559" Apr 22 16:49:23.843117 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:23.843080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" event={"ID":"ea8a77be-bb69-43cd-8c9c-571998741217","Type":"ContainerStarted","Data":"28f547f62210c3dd87795408ea537ce4214e0964aa8d50e6aaaf1b64cdc40bc6"} Apr 22 16:49:23.843480 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:23.843170 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:23.866244 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:23.866198 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" podStartSLOduration=1.677871815 podStartE2EDuration="10.866184976s" podCreationTimestamp="2026-04-22 16:49:13 +0000 UTC" firstStartedPulling="2026-04-22 16:49:14.2355689 +0000 UTC m=+1653.937586990" lastFinishedPulling="2026-04-22 16:49:23.423882064 +0000 UTC m=+1663.125900151" observedRunningTime="2026-04-22 16:49:23.86454775 +0000 UTC m=+1663.566565852" watchObservedRunningTime="2026-04-22 16:49:23.866184976 +0000 UTC m=+1663.568203084" Apr 22 16:49:29.849390 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:29.849354 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-kff8d" Apr 22 16:49:33.873680 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.873643 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h"] Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874016 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="util" Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874029 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="util" Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874053 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="pull" Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874059 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="pull" Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874076 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="extract" Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874082 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="extract" Apr 22 16:49:33.874217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.874136 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f955c79c-a1d6-44db-93ea-68cc1c154da3" containerName="extract" Apr 22 16:49:33.885291 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.885266 
2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:33.885291 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.885281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h"] Apr 22 16:49:33.888152 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.887986 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:49:33.888152 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.888055 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:49:33.889078 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.889061 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lwknc\"" Apr 22 16:49:33.964854 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.964819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwf6\" (UniqueName: \"kubernetes.io/projected/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-kube-api-access-gfwf6\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:33.965005 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.964863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:33.965005 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:33.964990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.066466 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.066436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.066609 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.066490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwf6\" (UniqueName: \"kubernetes.io/projected/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-kube-api-access-gfwf6\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.066609 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.066514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.066813 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.066794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.066847 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.066823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.079938 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.079910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwf6\" (UniqueName: \"kubernetes.io/projected/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-kube-api-access-gfwf6\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" Apr 22 16:49:34.195876 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.195847 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h"
Apr 22 16:49:34.317125 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.317103 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h"]
Apr 22 16:49:34.319205 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:34.319178 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a6797b0_0f7a_4fbc_8e2f_5a10f46b2576.slice/crio-39d412dba1b9d911fce792b0ee30d52f159320aec638f5053f4d2b01b535aee4 WatchSource:0}: Error finding container 39d412dba1b9d911fce792b0ee30d52f159320aec638f5053f4d2b01b535aee4: Status 404 returned error can't find the container with id 39d412dba1b9d911fce792b0ee30d52f159320aec638f5053f4d2b01b535aee4
Apr 22 16:49:34.880531 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.880497 2575 generic.go:358] "Generic (PLEG): container finished" podID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerID="0f931da121d40fd142211af30c1f13c91b38438c8fa7e2734128b1e312eab54a" exitCode=0
Apr 22 16:49:34.880922 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.880592 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" event={"ID":"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576","Type":"ContainerDied","Data":"0f931da121d40fd142211af30c1f13c91b38438c8fa7e2734128b1e312eab54a"}
Apr 22 16:49:34.880922 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:34.880632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" event={"ID":"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576","Type":"ContainerStarted","Data":"39d412dba1b9d911fce792b0ee30d52f159320aec638f5053f4d2b01b535aee4"}
Apr 22 16:49:35.885871 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:35.885787 2575 generic.go:358] "Generic (PLEG): container finished" podID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerID="913297189d350b8fb5053ee0acddd72a7a8ddf21092ae28cd24ee3d5366faf39" exitCode=0
Apr 22 16:49:35.885871 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:35.885834 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" event={"ID":"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576","Type":"ContainerDied","Data":"913297189d350b8fb5053ee0acddd72a7a8ddf21092ae28cd24ee3d5366faf39"}
Apr 22 16:49:36.893644 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:36.893607 2575 generic.go:358] "Generic (PLEG): container finished" podID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerID="55cf4204d18d95b7ef31d08d78e2b6e5b4239d85833b0890f931cdc8e1c36fbd" exitCode=0
Apr 22 16:49:36.894010 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:36.893685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" event={"ID":"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576","Type":"ContainerDied","Data":"55cf4204d18d95b7ef31d08d78e2b6e5b4239d85833b0890f931cdc8e1c36fbd"}
Apr 22 16:49:38.014487 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.014462 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h"
Apr 22 16:49:38.101826 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.101794 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-util\") pod \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") "
Apr 22 16:49:38.101991 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.101839 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfwf6\" (UniqueName: \"kubernetes.io/projected/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-kube-api-access-gfwf6\") pod \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") "
Apr 22 16:49:38.101991 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.101879 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-bundle\") pod \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\" (UID: \"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576\") "
Apr 22 16:49:38.102653 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.102616 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-bundle" (OuterVolumeSpecName: "bundle") pod "3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" (UID: "3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:49:38.104088 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.104068 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-kube-api-access-gfwf6" (OuterVolumeSpecName: "kube-api-access-gfwf6") pod "3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" (UID: "3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576"). InnerVolumeSpecName "kube-api-access-gfwf6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:49:38.107191 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.107158 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-util" (OuterVolumeSpecName: "util") pod "3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" (UID: "3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:49:38.203391 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.203369 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:49:38.203391 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.203393 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:49:38.203554 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.203402 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfwf6\" (UniqueName: \"kubernetes.io/projected/3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576-kube-api-access-gfwf6\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:49:38.902188 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.902154 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h"
Apr 22 16:49:38.904979 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.904949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54d22h" event={"ID":"3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576","Type":"ContainerDied","Data":"39d412dba1b9d911fce792b0ee30d52f159320aec638f5053f4d2b01b535aee4"}
Apr 22 16:49:38.904979 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:38.904982 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39d412dba1b9d911fce792b0ee30d52f159320aec638f5053f4d2b01b535aee4"
Apr 22 16:49:44.694208 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694170 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"]
Apr 22 16:49:44.694564 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694516 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="extract"
Apr 22 16:49:44.694564 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694527 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="extract"
Apr 22 16:49:44.694564 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694549 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="util"
Apr 22 16:49:44.694564 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694553 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="util"
Apr 22 16:49:44.694564 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694565 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="pull"
Apr 22 16:49:44.694713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694570 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="pull"
Apr 22 16:49:44.694713 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.694635 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a6797b0-0f7a-4fbc-8e2f-5a10f46b2576" containerName="extract"
Apr 22 16:49:44.698917 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.698897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.701556 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.701531 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 16:49:44.702487 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.702469 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 16:49:44.702585 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.702470 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lwknc\""
Apr 22 16:49:44.708274 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.708250 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"]
Apr 22 16:49:44.857626 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.857590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.857789 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.857648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrv6s\" (UniqueName: \"kubernetes.io/projected/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-kube-api-access-lrv6s\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.857789 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.857692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.958407 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.958326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.958407 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.958371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv6s\" (UniqueName: \"kubernetes.io/projected/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-kube-api-access-lrv6s\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.958407 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.958398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.958745 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.958730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.958796 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.958733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:44.979456 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:44.979433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrv6s\" (UniqueName: \"kubernetes.io/projected/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-kube-api-access-lrv6s\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:45.009335 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:45.009310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:45.137363 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:45.137340 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"]
Apr 22 16:49:45.139545 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:45.139514 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2ca3a4_f693_4d56_8184_3e069d99e1f3.slice/crio-228fbbdac2ed15573dfe6e48526d95af8909f8cc1e6038acdcb7ce54cd5740f6 WatchSource:0}: Error finding container 228fbbdac2ed15573dfe6e48526d95af8909f8cc1e6038acdcb7ce54cd5740f6: Status 404 returned error can't find the container with id 228fbbdac2ed15573dfe6e48526d95af8909f8cc1e6038acdcb7ce54cd5740f6
Apr 22 16:49:45.935026 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:45.934988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm" event={"ID":"1c2ca3a4-f693-4d56-8184-3e069d99e1f3","Type":"ContainerDied","Data":"912fa78dae84229edd0624f5d3cd8b492c41cce5cd9d35d479b16a66959b9aa1"}
Apr 22 16:49:45.935412 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:45.934985 2575 generic.go:358] "Generic (PLEG): container finished" podID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerID="912fa78dae84229edd0624f5d3cd8b492c41cce5cd9d35d479b16a66959b9aa1" exitCode=0
Apr 22 16:49:45.935412 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:45.935122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm" event={"ID":"1c2ca3a4-f693-4d56-8184-3e069d99e1f3","Type":"ContainerStarted","Data":"228fbbdac2ed15573dfe6e48526d95af8909f8cc1e6038acdcb7ce54cd5740f6"}
Apr 22 16:49:46.580489 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.580465 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"]
Apr 22 16:49:46.583631 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.583614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.586217 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.586198 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 16:49:46.586612 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.586588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-wf8fc\""
Apr 22 16:49:46.586818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.586802 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 16:49:46.587018 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.586911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 16:49:46.587018 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.586987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 16:49:46.587165 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.587084 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:49:46.602365 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.602329 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"]
Apr 22 16:49:46.672123 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.672095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrpw\" (UniqueName: \"kubernetes.io/projected/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-kube-api-access-zmrpw\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.672286 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.672153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-cert\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.672286 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.672196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-manager-config\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.672286 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.672239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-metrics-cert\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.772953 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.772871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-manager-config\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.772953 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.772923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-metrics-cert\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.772953 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.772944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrpw\" (UniqueName: \"kubernetes.io/projected/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-kube-api-access-zmrpw\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.773241 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.772980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-cert\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.773527 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.773499 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-manager-config\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.775483 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.775464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-metrics-cert\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.775691 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.775670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-cert\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.784955 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.784932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrpw\" (UniqueName: \"kubernetes.io/projected/b489fa58-5f6c-4bbf-ba63-a73c1f64f28e-kube-api-access-zmrpw\") pod \"lws-controller-manager-7c5749599b-gjqlf\" (UID: \"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.877266 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.877237 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"]
Apr 22 16:49:46.880750 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.880729 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:46.885876 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.885854 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 16:49:46.885991 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.885930 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 16:49:46.886502 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.886482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-b4qgt\""
Apr 22 16:49:46.886736 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.886722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 16:49:46.886966 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.886949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 16:49:46.892325 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.892309 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"
Apr 22 16:49:46.906306 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.906281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"]
Apr 22 16:49:46.943025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.942005 2575 generic.go:358] "Generic (PLEG): container finished" podID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerID="c538c5bde502f724117224ceec19b2a2ab6b908ed8ff84b6581611dc268c8c3a" exitCode=0
Apr 22 16:49:46.943025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.942151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm" event={"ID":"1c2ca3a4-f693-4d56-8184-3e069d99e1f3","Type":"ContainerDied","Data":"c538c5bde502f724117224ceec19b2a2ab6b908ed8ff84b6581611dc268c8c3a"}
Apr 22 16:49:46.974144 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.974072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:46.974144 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.974128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:46.974354 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:46.974182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvdz\" (UniqueName: \"kubernetes.io/projected/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-kube-api-access-wkvdz\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.047570 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.047540 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf"]
Apr 22 16:49:47.048083 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:47.048027 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb489fa58_5f6c_4bbf_ba63_a73c1f64f28e.slice/crio-5df77fb3455da5b4efbfcdb4a3a2546f55de25114ee0749fdd12fb27ac8732e8 WatchSource:0}: Error finding container 5df77fb3455da5b4efbfcdb4a3a2546f55de25114ee0749fdd12fb27ac8732e8: Status 404 returned error can't find the container with id 5df77fb3455da5b4efbfcdb4a3a2546f55de25114ee0749fdd12fb27ac8732e8
Apr 22 16:49:47.075118 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.075087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.075287 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.075257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvdz\" (UniqueName: \"kubernetes.io/projected/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-kube-api-access-wkvdz\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.075723 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.075697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.078379 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.078357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.078454 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.078383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.086906 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.086880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvdz\" (UniqueName: \"kubernetes.io/projected/c3435da7-2b7c-47d4-b6f4-dc09140dd90e-kube-api-access-wkvdz\") pod \"opendatahub-operator-controller-manager-57c8d5d679-b8pdc\" (UID: \"c3435da7-2b7c-47d4-b6f4-dc09140dd90e\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.192207 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.192180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"
Apr 22 16:49:47.321252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.321226 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc"]
Apr 22 16:49:47.323182 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:49:47.323145 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3435da7_2b7c_47d4_b6f4_dc09140dd90e.slice/crio-2b3093a878cd4d36781cf1fb34da1ccf971311c4a7c77db717f33601526d84d6 WatchSource:0}: Error finding container 2b3093a878cd4d36781cf1fb34da1ccf971311c4a7c77db717f33601526d84d6: Status 404 returned error can't find the container with id 2b3093a878cd4d36781cf1fb34da1ccf971311c4a7c77db717f33601526d84d6
Apr 22 16:49:47.949181 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.949128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf" event={"ID":"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e","Type":"ContainerStarted","Data":"5df77fb3455da5b4efbfcdb4a3a2546f55de25114ee0749fdd12fb27ac8732e8"}
Apr 22 16:49:47.951472 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.951439 2575 generic.go:358] "Generic (PLEG): container finished" podID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerID="c8e275f77550533685e42d6191fa80bbfae0e8624a946909c601f4628f77e49c" exitCode=0
Apr 22 16:49:47.951607 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.951554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm" event={"ID":"1c2ca3a4-f693-4d56-8184-3e069d99e1f3","Type":"ContainerDied","Data":"c8e275f77550533685e42d6191fa80bbfae0e8624a946909c601f4628f77e49c"}
Apr 22 16:49:47.953192 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:47.953157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc" event={"ID":"c3435da7-2b7c-47d4-b6f4-dc09140dd90e","Type":"ContainerStarted","Data":"2b3093a878cd4d36781cf1fb34da1ccf971311c4a7c77db717f33601526d84d6"}
Apr 22 16:49:49.487034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.487008 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm"
Apr 22 16:49:49.600034 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.600001 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-bundle\") pod \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") "
Apr 22 16:49:49.600224 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.600105 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrv6s\" (UniqueName: \"kubernetes.io/projected/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-kube-api-access-lrv6s\") pod \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") "
Apr 22 16:49:49.600224 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.600172 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-util\") pod \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\" (UID: \"1c2ca3a4-f693-4d56-8184-3e069d99e1f3\") "
Apr 22 16:49:49.601390 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.601361 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-bundle" (OuterVolumeSpecName: "bundle") pod "1c2ca3a4-f693-4d56-8184-3e069d99e1f3" (UID: "1c2ca3a4-f693-4d56-8184-3e069d99e1f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:49:49.603216 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.603179 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-kube-api-access-lrv6s" (OuterVolumeSpecName: "kube-api-access-lrv6s") pod "1c2ca3a4-f693-4d56-8184-3e069d99e1f3" (UID: "1c2ca3a4-f693-4d56-8184-3e069d99e1f3"). InnerVolumeSpecName "kube-api-access-lrv6s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:49:49.608720 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.608694 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-util" (OuterVolumeSpecName: "util") pod "1c2ca3a4-f693-4d56-8184-3e069d99e1f3" (UID: "1c2ca3a4-f693-4d56-8184-3e069d99e1f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:49:49.701412 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.701380 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:49:49.701412 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.701412 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:49:49.701607 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.701429 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrv6s\" (UniqueName: \"kubernetes.io/projected/1c2ca3a4-f693-4d56-8184-3e069d99e1f3-kube-api-access-lrv6s\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\""
Apr 22 16:49:49.963689 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.963655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm" event={"ID":"1c2ca3a4-f693-4d56-8184-3e069d99e1f3","Type":"ContainerDied","Data":"228fbbdac2ed15573dfe6e48526d95af8909f8cc1e6038acdcb7ce54cd5740f6"}
Apr 22 16:49:49.963689 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.963689 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228fbbdac2ed15573dfe6e48526d95af8909f8cc1e6038acdcb7ce54cd5740f6"
Apr 22 16:49:49.963907 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:49.963710 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xt6zm" Apr 22 16:49:50.968169 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:50.968137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc" event={"ID":"c3435da7-2b7c-47d4-b6f4-dc09140dd90e","Type":"ContainerStarted","Data":"0785ed7f51ec4a6442f9bc84007e6b8b959b17c4c098aa4c23727f2dcd569396"} Apr 22 16:49:50.968649 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:50.968205 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc" Apr 22 16:49:50.969481 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:50.969454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf" event={"ID":"b489fa58-5f6c-4bbf-ba63-a73c1f64f28e","Type":"ContainerStarted","Data":"b0c86fc74f69b8d4df340734b24ea0268da616c31689b4e2afc5a7c19dfba1de"} Apr 22 16:49:50.969612 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:50.969585 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf" Apr 22 16:49:50.989877 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:50.989832 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc" podStartSLOduration=2.220509781 podStartE2EDuration="4.989818938s" podCreationTimestamp="2026-04-22 16:49:46 +0000 UTC" firstStartedPulling="2026-04-22 16:49:47.32478694 +0000 UTC m=+1687.026805027" lastFinishedPulling="2026-04-22 16:49:50.094096097 +0000 UTC m=+1689.796114184" observedRunningTime="2026-04-22 16:49:50.988214886 +0000 UTC m=+1690.690232996" watchObservedRunningTime="2026-04-22 16:49:50.989818938 +0000 UTC m=+1690.691837046" Apr 22 16:49:51.006647 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:49:51.006612 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf" podStartSLOduration=2.002692644 podStartE2EDuration="5.006601383s" podCreationTimestamp="2026-04-22 16:49:46 +0000 UTC" firstStartedPulling="2026-04-22 16:49:47.050122834 +0000 UTC m=+1686.752140922" lastFinishedPulling="2026-04-22 16:49:50.054031566 +0000 UTC m=+1689.756049661" observedRunningTime="2026-04-22 16:49:51.00561836 +0000 UTC m=+1690.707636468" watchObservedRunningTime="2026-04-22 16:49:51.006601383 +0000 UTC m=+1690.708619492" Apr 22 16:50:01.975519 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:01.975447 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-b8pdc" Apr 22 16:50:01.975875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:01.975601 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-gjqlf" Apr 22 16:50:04.160174 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160133 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p"] Apr 22 16:50:04.160631 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160608 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="pull" Apr 22 16:50:04.160701 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160635 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="pull" Apr 22 16:50:04.160701 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160648 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="util" Apr 22 
16:50:04.160701 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160656 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="util" Apr 22 16:50:04.160701 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160681 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="extract" Apr 22 16:50:04.160701 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160690 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="extract" Apr 22 16:50:04.160934 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.160790 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c2ca3a4-f693-4d56-8184-3e069d99e1f3" containerName="extract" Apr 22 16:50:04.166739 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.166716 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.169262 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.169242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:50:04.170189 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.170162 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:50:04.170299 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.170170 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lwknc\"" Apr 22 16:50:04.189520 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.189494 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p"] Apr 22 16:50:04.216662 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.216633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.216804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.216677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9glg\" (UniqueName: \"kubernetes.io/projected/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-kube-api-access-v9glg\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.216804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.216699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.317540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.317511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.317696 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:50:04.317556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9glg\" (UniqueName: \"kubernetes.io/projected/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-kube-api-access-v9glg\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.317696 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.317579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.317899 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.317882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.317975 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.317956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.331481 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.331457 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v9glg\" (UniqueName: \"kubernetes.io/projected/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-kube-api-access-v9glg\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.476469 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.476439 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:04.607776 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:04.607752 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p"] Apr 22 16:50:04.610504 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:50:04.610477 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2b1af28_0dae_470b_89c4_aa6ee9ff7135.slice/crio-e926b898dae9d03dbeb58e8ac816cf43bbf7a3e848d5c3cf841ee2c73fb1a00b WatchSource:0}: Error finding container e926b898dae9d03dbeb58e8ac816cf43bbf7a3e848d5c3cf841ee2c73fb1a00b: Status 404 returned error can't find the container with id e926b898dae9d03dbeb58e8ac816cf43bbf7a3e848d5c3cf841ee2c73fb1a00b Apr 22 16:50:05.018723 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:05.018692 2575 generic.go:358] "Generic (PLEG): container finished" podID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerID="16eed8603c7df48aaf05ad648d4debeee5da400b471293cc1f2050f0b9cc172d" exitCode=0 Apr 22 16:50:05.018894 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:05.018775 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" 
event={"ID":"b2b1af28-0dae-470b-89c4-aa6ee9ff7135","Type":"ContainerDied","Data":"16eed8603c7df48aaf05ad648d4debeee5da400b471293cc1f2050f0b9cc172d"} Apr 22 16:50:05.018894 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:05.018797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" event={"ID":"b2b1af28-0dae-470b-89c4-aa6ee9ff7135","Type":"ContainerStarted","Data":"e926b898dae9d03dbeb58e8ac816cf43bbf7a3e848d5c3cf841ee2c73fb1a00b"} Apr 22 16:50:07.026367 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:07.026334 2575 generic.go:358] "Generic (PLEG): container finished" podID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerID="1346543e18a5b7868fc6192450687fce2dec4c25cb28269a7bb41d7e7f074b0b" exitCode=0 Apr 22 16:50:07.026747 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:07.026409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" event={"ID":"b2b1af28-0dae-470b-89c4-aa6ee9ff7135","Type":"ContainerDied","Data":"1346543e18a5b7868fc6192450687fce2dec4c25cb28269a7bb41d7e7f074b0b"} Apr 22 16:50:08.031331 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:08.031300 2575 generic.go:358] "Generic (PLEG): container finished" podID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerID="2f08ec0bae6e04aa28160753e1b477420b436ef9622c396b41b2cb4fe4b80db0" exitCode=0 Apr 22 16:50:08.031677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:08.031365 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" event={"ID":"b2b1af28-0dae-470b-89c4-aa6ee9ff7135","Type":"ContainerDied","Data":"2f08ec0bae6e04aa28160753e1b477420b436ef9622c396b41b2cb4fe4b80db0"} Apr 22 16:50:09.151251 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.151230 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:09.261249 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.261218 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-bundle\") pod \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " Apr 22 16:50:09.261397 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.261294 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9glg\" (UniqueName: \"kubernetes.io/projected/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-kube-api-access-v9glg\") pod \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " Apr 22 16:50:09.261397 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.261319 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-util\") pod \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\" (UID: \"b2b1af28-0dae-470b-89c4-aa6ee9ff7135\") " Apr 22 16:50:09.262178 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.262149 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-bundle" (OuterVolumeSpecName: "bundle") pod "b2b1af28-0dae-470b-89c4-aa6ee9ff7135" (UID: "b2b1af28-0dae-470b-89c4-aa6ee9ff7135"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:50:09.263467 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.263449 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-kube-api-access-v9glg" (OuterVolumeSpecName: "kube-api-access-v9glg") pod "b2b1af28-0dae-470b-89c4-aa6ee9ff7135" (UID: "b2b1af28-0dae-470b-89c4-aa6ee9ff7135"). InnerVolumeSpecName "kube-api-access-v9glg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:50:09.267009 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.266984 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-util" (OuterVolumeSpecName: "util") pod "b2b1af28-0dae-470b-89c4-aa6ee9ff7135" (UID: "b2b1af28-0dae-470b-89c4-aa6ee9ff7135"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:50:09.362106 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.361987 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:50:09.362106 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.362019 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v9glg\" (UniqueName: \"kubernetes.io/projected/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-kube-api-access-v9glg\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:50:09.362106 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:09.362066 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2b1af28-0dae-470b-89c4-aa6ee9ff7135-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:50:10.039928 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:10.039903 2575 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" Apr 22 16:50:10.040108 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:10.039906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wr22p" event={"ID":"b2b1af28-0dae-470b-89c4-aa6ee9ff7135","Type":"ContainerDied","Data":"e926b898dae9d03dbeb58e8ac816cf43bbf7a3e848d5c3cf841ee2c73fb1a00b"} Apr 22 16:50:10.040108 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:10.040007 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e926b898dae9d03dbeb58e8ac816cf43bbf7a3e848d5c3cf841ee2c73fb1a00b" Apr 22 16:50:18.568307 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568276 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4"] Apr 22 16:50:18.568655 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568631 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="pull" Apr 22 16:50:18.568655 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568642 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="pull" Apr 22 16:50:18.568655 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568653 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="util" Apr 22 16:50:18.568760 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568659 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="util" Apr 22 16:50:18.568760 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568672 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="extract" Apr 22 16:50:18.568760 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568677 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="extract" Apr 22 16:50:18.568760 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.568748 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2b1af28-0dae-470b-89c4-aa6ee9ff7135" containerName="extract" Apr 22 16:50:18.573285 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.573264 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.577327 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.577304 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 16:50:18.577434 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.577351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 16:50:18.578373 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.578354 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-lwknc\"" Apr 22 16:50:18.596823 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.596789 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4"] Apr 22 16:50:18.641169 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.641145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: 
\"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.641304 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.641189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.641368 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.641316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4ff\" (UniqueName: \"kubernetes.io/projected/9108e597-5415-4741-a226-72b8c76d650c-kube-api-access-8n4ff\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.742215 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.742177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4ff\" (UniqueName: \"kubernetes.io/projected/9108e597-5415-4741-a226-72b8c76d650c-kube-api-access-8n4ff\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.742375 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.742223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: 
\"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.742375 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.742248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.742612 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.742592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.742672 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.742600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.765005 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.764979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4ff\" (UniqueName: \"kubernetes.io/projected/9108e597-5415-4741-a226-72b8c76d650c-kube-api-access-8n4ff\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:18.882845 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:18.882762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:19.010550 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:19.010522 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4"] Apr 22 16:50:19.012585 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:50:19.012552 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9108e597_5415_4741_a226_72b8c76d650c.slice/crio-99373b93f4004a8473c50ed97bba2dfee51bc868b418263a34bfd7e0a3b20849 WatchSource:0}: Error finding container 99373b93f4004a8473c50ed97bba2dfee51bc868b418263a34bfd7e0a3b20849: Status 404 returned error can't find the container with id 99373b93f4004a8473c50ed97bba2dfee51bc868b418263a34bfd7e0a3b20849 Apr 22 16:50:19.072414 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:19.072385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" event={"ID":"9108e597-5415-4741-a226-72b8c76d650c","Type":"ContainerStarted","Data":"99373b93f4004a8473c50ed97bba2dfee51bc868b418263a34bfd7e0a3b20849"} Apr 22 16:50:20.077463 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:20.077369 2575 generic.go:358] "Generic (PLEG): container finished" podID="9108e597-5415-4741-a226-72b8c76d650c" containerID="16d18face4395e44ed3bf39bb792d229f383891e707b92c723252b6c30a4eb5a" exitCode=0 Apr 22 16:50:20.077812 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:20.077456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" event={"ID":"9108e597-5415-4741-a226-72b8c76d650c","Type":"ContainerDied","Data":"16d18face4395e44ed3bf39bb792d229f383891e707b92c723252b6c30a4eb5a"} Apr 22 16:50:21.083254 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:21.083223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" event={"ID":"9108e597-5415-4741-a226-72b8c76d650c","Type":"ContainerStarted","Data":"d6a825e4b0370309c3297c9a3eee3420f2f04b3834d99713ce96483de5fa8b0e"} Apr 22 16:50:22.088652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:22.088615 2575 generic.go:358] "Generic (PLEG): container finished" podID="9108e597-5415-4741-a226-72b8c76d650c" containerID="d6a825e4b0370309c3297c9a3eee3420f2f04b3834d99713ce96483de5fa8b0e" exitCode=0 Apr 22 16:50:22.089107 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:22.088759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" event={"ID":"9108e597-5415-4741-a226-72b8c76d650c","Type":"ContainerDied","Data":"d6a825e4b0370309c3297c9a3eee3420f2f04b3834d99713ce96483de5fa8b0e"} Apr 22 16:50:23.094235 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:23.094199 2575 generic.go:358] "Generic (PLEG): container finished" podID="9108e597-5415-4741-a226-72b8c76d650c" containerID="303c907158e6a45ef9763d5926c40eda6f00b67b756325110bb7c416f61a5881" exitCode=0 Apr 22 16:50:23.094586 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:23.094245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" event={"ID":"9108e597-5415-4741-a226-72b8c76d650c","Type":"ContainerDied","Data":"303c907158e6a45ef9763d5926c40eda6f00b67b756325110bb7c416f61a5881"} Apr 22 16:50:24.226407 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:50:24.226383 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:24.289771 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.289748 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n4ff\" (UniqueName: \"kubernetes.io/projected/9108e597-5415-4741-a226-72b8c76d650c-kube-api-access-8n4ff\") pod \"9108e597-5415-4741-a226-72b8c76d650c\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " Apr 22 16:50:24.289901 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.289802 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-bundle\") pod \"9108e597-5415-4741-a226-72b8c76d650c\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " Apr 22 16:50:24.289901 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.289820 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-util\") pod \"9108e597-5415-4741-a226-72b8c76d650c\" (UID: \"9108e597-5415-4741-a226-72b8c76d650c\") " Apr 22 16:50:24.290752 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.290730 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-bundle" (OuterVolumeSpecName: "bundle") pod "9108e597-5415-4741-a226-72b8c76d650c" (UID: "9108e597-5415-4741-a226-72b8c76d650c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:50:24.291932 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.291906 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9108e597-5415-4741-a226-72b8c76d650c-kube-api-access-8n4ff" (OuterVolumeSpecName: "kube-api-access-8n4ff") pod "9108e597-5415-4741-a226-72b8c76d650c" (UID: "9108e597-5415-4741-a226-72b8c76d650c"). InnerVolumeSpecName "kube-api-access-8n4ff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:50:24.391497 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.391412 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8n4ff\" (UniqueName: \"kubernetes.io/projected/9108e597-5415-4741-a226-72b8c76d650c-kube-api-access-8n4ff\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:50:24.391497 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.391450 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:50:24.921124 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.921087 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-util" (OuterVolumeSpecName: "util") pod "9108e597-5415-4741-a226-72b8c76d650c" (UID: "9108e597-5415-4741-a226-72b8c76d650c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:50:24.996224 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:24.996193 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9108e597-5415-4741-a226-72b8c76d650c-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:50:25.102517 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:25.102486 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" Apr 22 16:50:25.102676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:25.102489 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebm5lj4" event={"ID":"9108e597-5415-4741-a226-72b8c76d650c","Type":"ContainerDied","Data":"99373b93f4004a8473c50ed97bba2dfee51bc868b418263a34bfd7e0a3b20849"} Apr 22 16:50:25.102676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:25.102594 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99373b93f4004a8473c50ed97bba2dfee51bc868b418263a34bfd7e0a3b20849" Apr 22 16:50:34.103340 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103307 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn"] Apr 22 16:50:34.103817 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103795 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="util" Apr 22 16:50:34.103875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103818 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="util" Apr 22 16:50:34.103875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103827 2575 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="pull" Apr 22 16:50:34.103875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103832 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="pull" Apr 22 16:50:34.103875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103847 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="extract" Apr 22 16:50:34.103875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103852 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="extract" Apr 22 16:50:34.104144 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.103906 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9108e597-5415-4741-a226-72b8c76d650c" containerName="extract" Apr 22 16:50:34.112476 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.112457 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.114889 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.114869 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 16:50:34.114987 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.114960 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-czkrh\"" Apr 22 16:50:34.119896 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.119872 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn"] Apr 22 16:50:34.277187 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277187 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ss5\" (UniqueName: \"kubernetes.io/projected/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-kube-api-access-m9ss5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277409 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.277621 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.277431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.378665 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.378586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.378665 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.378623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m9ss5\" (UniqueName: \"kubernetes.io/projected/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-kube-api-access-m9ss5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.378665 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.378650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.378912 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.378891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.378979 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.378956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379147 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379147 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379577 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.379758 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.379737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.381340 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.381317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.381576 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.381558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.388989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.388969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.389128 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.389107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ss5\" (UniqueName: \"kubernetes.io/projected/4c0a40a6-c1b0-4337-a99a-e73afe30deb9-kube-api-access-m9ss5\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557fzjdfn\" (UID: \"4c0a40a6-c1b0-4337-a99a-e73afe30deb9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.424126 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.424085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:34.551788 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:34.551755 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn"] Apr 22 16:50:34.555290 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:50:34.555261 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0a40a6_c1b0_4337_a99a_e73afe30deb9.slice/crio-4299b9ce046e7595ecdf45b380f1e9a65782fd762aaee1130e771d46a4b915b4 WatchSource:0}: Error finding container 4299b9ce046e7595ecdf45b380f1e9a65782fd762aaee1130e771d46a4b915b4: Status 404 returned error can't find the container with id 4299b9ce046e7595ecdf45b380f1e9a65782fd762aaee1130e771d46a4b915b4 Apr 22 16:50:35.138517 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:35.138479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" event={"ID":"4c0a40a6-c1b0-4337-a99a-e73afe30deb9","Type":"ContainerStarted","Data":"4299b9ce046e7595ecdf45b380f1e9a65782fd762aaee1130e771d46a4b915b4"} Apr 22 16:50:37.042364 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:37.042326 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 16:50:37.042592 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:37.042425 2575 kubelet_resources.go:45] 
"Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 16:50:37.042592 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:37.042461 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 16:50:37.149496 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:37.149460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" event={"ID":"4c0a40a6-c1b0-4337-a99a-e73afe30deb9","Type":"ContainerStarted","Data":"50ea34a9c19ef95102b2f85b0437a84428730c452b04a7976fa5feac9dd9a933"} Apr 22 16:50:37.170904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:37.170851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" podStartSLOduration=0.686146237 podStartE2EDuration="3.170837179s" podCreationTimestamp="2026-04-22 16:50:34 +0000 UTC" firstStartedPulling="2026-04-22 16:50:34.557318297 +0000 UTC m=+1734.259336384" lastFinishedPulling="2026-04-22 16:50:37.042009238 +0000 UTC m=+1736.744027326" observedRunningTime="2026-04-22 16:50:37.17008583 +0000 UTC m=+1736.872103940" watchObservedRunningTime="2026-04-22 16:50:37.170837179 +0000 UTC m=+1736.872855292" Apr 22 16:50:37.424701 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:37.424631 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:38.429973 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:38.429944 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:39.157082 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:39.157034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:50:39.157913 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:50:39.157889 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fzjdfn" Apr 22 16:51:05.061750 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.061709 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m6wzv"] Apr 22 16:51:05.067082 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.067064 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:05.069651 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.069622 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 16:51:05.069765 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.069669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-zh29m\"" Apr 22 16:51:05.070539 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.070522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 16:51:05.072419 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.072400 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m6wzv"] Apr 22 16:51:05.145154 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.145119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thh9\" (UniqueName: 
\"kubernetes.io/projected/bc5f4f07-e97c-4448-a134-8aec3f0bef3e-kube-api-access-6thh9\") pod \"kuadrant-operator-catalog-m6wzv\" (UID: \"bc5f4f07-e97c-4448-a134-8aec3f0bef3e\") " pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:05.245837 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.245801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6thh9\" (UniqueName: \"kubernetes.io/projected/bc5f4f07-e97c-4448-a134-8aec3f0bef3e-kube-api-access-6thh9\") pod \"kuadrant-operator-catalog-m6wzv\" (UID: \"bc5f4f07-e97c-4448-a134-8aec3f0bef3e\") " pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:05.253466 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.253445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thh9\" (UniqueName: \"kubernetes.io/projected/bc5f4f07-e97c-4448-a134-8aec3f0bef3e-kube-api-access-6thh9\") pod \"kuadrant-operator-catalog-m6wzv\" (UID: \"bc5f4f07-e97c-4448-a134-8aec3f0bef3e\") " pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:05.378236 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.378161 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:05.432843 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.432790 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m6wzv"] Apr 22 16:51:05.499297 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.499273 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m6wzv"] Apr 22 16:51:05.501338 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:05.501311 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5f4f07_e97c_4448_a134_8aec3f0bef3e.slice/crio-02401b96400d99f13521cac4d5b1cc9b1d388d4af1932029c992828a3b636c67 WatchSource:0}: Error finding container 02401b96400d99f13521cac4d5b1cc9b1d388d4af1932029c992828a3b636c67: Status 404 returned error can't find the container with id 02401b96400d99f13521cac4d5b1cc9b1d388d4af1932029c992828a3b636c67 Apr 22 16:51:05.642983 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.642921 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7fpqw"] Apr 22 16:51:05.647804 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.647788 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:05.652374 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.652351 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7fpqw"] Apr 22 16:51:05.748940 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.748906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4lgt\" (UniqueName: \"kubernetes.io/projected/e827b06d-94a9-4295-a945-985a14cbb42a-kube-api-access-s4lgt\") pod \"kuadrant-operator-catalog-7fpqw\" (UID: \"e827b06d-94a9-4295-a945-985a14cbb42a\") " pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:05.849859 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.849830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4lgt\" (UniqueName: \"kubernetes.io/projected/e827b06d-94a9-4295-a945-985a14cbb42a-kube-api-access-s4lgt\") pod \"kuadrant-operator-catalog-7fpqw\" (UID: \"e827b06d-94a9-4295-a945-985a14cbb42a\") " pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:05.857298 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.857278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4lgt\" (UniqueName: \"kubernetes.io/projected/e827b06d-94a9-4295-a945-985a14cbb42a-kube-api-access-s4lgt\") pod \"kuadrant-operator-catalog-7fpqw\" (UID: \"e827b06d-94a9-4295-a945-985a14cbb42a\") " pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:05.958151 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:05.958120 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:06.086131 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:06.086106 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7fpqw"] Apr 22 16:51:06.088177 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:06.088148 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode827b06d_94a9_4295_a945_985a14cbb42a.slice/crio-6c0fd12ee1f3c0615682d54a46fb9c152fc9570c19d09acbb222f3c155a79e42 WatchSource:0}: Error finding container 6c0fd12ee1f3c0615682d54a46fb9c152fc9570c19d09acbb222f3c155a79e42: Status 404 returned error can't find the container with id 6c0fd12ee1f3c0615682d54a46fb9c152fc9570c19d09acbb222f3c155a79e42 Apr 22 16:51:06.252536 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:06.252464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" event={"ID":"bc5f4f07-e97c-4448-a134-8aec3f0bef3e","Type":"ContainerStarted","Data":"02401b96400d99f13521cac4d5b1cc9b1d388d4af1932029c992828a3b636c67"} Apr 22 16:51:06.253715 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:06.253683 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" event={"ID":"e827b06d-94a9-4295-a945-985a14cbb42a","Type":"ContainerStarted","Data":"6c0fd12ee1f3c0615682d54a46fb9c152fc9570c19d09acbb222f3c155a79e42"} Apr 22 16:51:08.264145 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.264102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" event={"ID":"bc5f4f07-e97c-4448-a134-8aec3f0bef3e","Type":"ContainerStarted","Data":"05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9"} Apr 22 16:51:08.264587 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.264163 2575 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" podUID="bc5f4f07-e97c-4448-a134-8aec3f0bef3e" containerName="registry-server" containerID="cri-o://05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9" gracePeriod=2 Apr 22 16:51:08.266230 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.266202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" event={"ID":"e827b06d-94a9-4295-a945-985a14cbb42a","Type":"ContainerStarted","Data":"f34f063c06081f74ff8569c82b82f7e29aa57852a8373a7fddb61488a1a8893b"} Apr 22 16:51:08.280798 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.280757 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" podStartSLOduration=0.914213829 podStartE2EDuration="3.280744705s" podCreationTimestamp="2026-04-22 16:51:05 +0000 UTC" firstStartedPulling="2026-04-22 16:51:05.503035178 +0000 UTC m=+1765.205053269" lastFinishedPulling="2026-04-22 16:51:07.869566056 +0000 UTC m=+1767.571584145" observedRunningTime="2026-04-22 16:51:08.277715315 +0000 UTC m=+1767.979733423" watchObservedRunningTime="2026-04-22 16:51:08.280744705 +0000 UTC m=+1767.982762813" Apr 22 16:51:08.294769 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.294721 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" podStartSLOduration=1.511727096 podStartE2EDuration="3.294710169s" podCreationTimestamp="2026-04-22 16:51:05 +0000 UTC" firstStartedPulling="2026-04-22 16:51:06.089591662 +0000 UTC m=+1765.791609750" lastFinishedPulling="2026-04-22 16:51:07.872574732 +0000 UTC m=+1767.574592823" observedRunningTime="2026-04-22 16:51:08.292622268 +0000 UTC m=+1767.994640387" watchObservedRunningTime="2026-04-22 16:51:08.294710169 +0000 UTC m=+1767.996728294" Apr 22 16:51:08.501812 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.501790 2575 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:08.573382 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.573304 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6thh9\" (UniqueName: \"kubernetes.io/projected/bc5f4f07-e97c-4448-a134-8aec3f0bef3e-kube-api-access-6thh9\") pod \"bc5f4f07-e97c-4448-a134-8aec3f0bef3e\" (UID: \"bc5f4f07-e97c-4448-a134-8aec3f0bef3e\") " Apr 22 16:51:08.575575 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.575552 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5f4f07-e97c-4448-a134-8aec3f0bef3e-kube-api-access-6thh9" (OuterVolumeSpecName: "kube-api-access-6thh9") pod "bc5f4f07-e97c-4448-a134-8aec3f0bef3e" (UID: "bc5f4f07-e97c-4448-a134-8aec3f0bef3e"). InnerVolumeSpecName "kube-api-access-6thh9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:51:08.674260 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:08.674226 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6thh9\" (UniqueName: \"kubernetes.io/projected/bc5f4f07-e97c-4448-a134-8aec3f0bef3e-kube-api-access-6thh9\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:09.270716 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.270681 2575 generic.go:358] "Generic (PLEG): container finished" podID="bc5f4f07-e97c-4448-a134-8aec3f0bef3e" containerID="05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9" exitCode=0 Apr 22 16:51:09.271122 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.270772 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" Apr 22 16:51:09.271122 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.270770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" event={"ID":"bc5f4f07-e97c-4448-a134-8aec3f0bef3e","Type":"ContainerDied","Data":"05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9"} Apr 22 16:51:09.271122 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.270816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m6wzv" event={"ID":"bc5f4f07-e97c-4448-a134-8aec3f0bef3e","Type":"ContainerDied","Data":"02401b96400d99f13521cac4d5b1cc9b1d388d4af1932029c992828a3b636c67"} Apr 22 16:51:09.271122 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.270831 2575 scope.go:117] "RemoveContainer" containerID="05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9" Apr 22 16:51:09.279652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.279634 2575 scope.go:117] "RemoveContainer" containerID="05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9" Apr 22 16:51:09.279937 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:51:09.279914 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9\": container with ID starting with 05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9 not found: ID does not exist" containerID="05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9" Apr 22 16:51:09.280004 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.279951 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9"} err="failed to get container status \"05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9\": rpc 
error: code = NotFound desc = could not find container \"05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9\": container with ID starting with 05a9d9b68c9f56ce2ba5534ef6aa91a5664c23fef72caf95a59a8c24d3586bc9 not found: ID does not exist" Apr 22 16:51:09.286956 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.286935 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m6wzv"] Apr 22 16:51:09.289915 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:09.289894 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m6wzv"] Apr 22 16:51:10.905623 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:10.905591 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5f4f07-e97c-4448-a134-8aec3f0bef3e" path="/var/lib/kubelet/pods/bc5f4f07-e97c-4448-a134-8aec3f0bef3e/volumes" Apr 22 16:51:15.958583 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:15.958550 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:15.959077 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:15.958773 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:15.979951 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:15.979922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:16.319197 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:16.319128 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-7fpqw" Apr 22 16:51:20.874024 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.873993 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v"] Apr 22 16:51:20.874426 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.874359 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5f4f07-e97c-4448-a134-8aec3f0bef3e" containerName="registry-server" Apr 22 16:51:20.874426 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.874372 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5f4f07-e97c-4448-a134-8aec3f0bef3e" containerName="registry-server" Apr 22 16:51:20.874506 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.874430 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5f4f07-e97c-4448-a134-8aec3f0bef3e" containerName="registry-server" Apr 22 16:51:20.878961 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.878943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:20.882406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.882383 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9455s\"" Apr 22 16:51:20.884219 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.884198 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v"] Apr 22 16:51:20.977763 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.977725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:20.977967 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.977852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k47sl\" (UniqueName: \"kubernetes.io/projected/590bf948-b178-4415-b16b-a6339ab50c2c-kube-api-access-k47sl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:20.977967 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:20.977908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.079169 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.079137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.079332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.079194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k47sl\" (UniqueName: \"kubernetes.io/projected/590bf948-b178-4415-b16b-a6339ab50c2c-kube-api-access-k47sl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.079332 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.079226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.079559 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.079534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.079653 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.079626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.087492 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.087467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k47sl\" (UniqueName: \"kubernetes.io/projected/590bf948-b178-4415-b16b-a6339ab50c2c-kube-api-access-k47sl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.189224 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.189193 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:21.517796 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.517771 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v"] Apr 22 16:51:21.519435 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:21.519397 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590bf948_b178_4415_b16b_a6339ab50c2c.slice/crio-fdca5b9f433cd67f376987b968f6b843df3bc00c75f97dde40562d4083532b43 WatchSource:0}: Error finding container fdca5b9f433cd67f376987b968f6b843df3bc00c75f97dde40562d4083532b43: Status 404 returned error can't find the container with id fdca5b9f433cd67f376987b968f6b843df3bc00c75f97dde40562d4083532b43 Apr 22 16:51:21.673123 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.673095 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k"] Apr 22 16:51:21.675754 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.675732 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.683344 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.683321 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k"] Apr 22 16:51:21.786003 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.785914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.786003 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.785959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.786213 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.786076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbmhg\" (UniqueName: \"kubernetes.io/projected/9a6b497b-a168-4b3d-b259-07fc47a07416-kube-api-access-sbmhg\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.887343 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.887308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sbmhg\" (UniqueName: \"kubernetes.io/projected/9a6b497b-a168-4b3d-b259-07fc47a07416-kube-api-access-sbmhg\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.887710 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.887375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.887710 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.887408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.887841 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.887820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.887876 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.887831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.895800 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.895779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbmhg\" (UniqueName: \"kubernetes.io/projected/9a6b497b-a168-4b3d-b259-07fc47a07416-kube-api-access-sbmhg\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:21.985449 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:21.985424 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:22.091374 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.091338 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x"] Apr 22 16:51:22.095524 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.095499 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.101589 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.101566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x"] Apr 22 16:51:22.111632 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.111607 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k"] Apr 22 16:51:22.112067 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:22.112031 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6b497b_a168_4b3d_b259_07fc47a07416.slice/crio-1548ee50559fddc2bbcd411fd7989bd7c3d9319e15b0e503fb126b9b31fd69ea WatchSource:0}: Error finding container 1548ee50559fddc2bbcd411fd7989bd7c3d9319e15b0e503fb126b9b31fd69ea: Status 404 returned error can't find the container with id 1548ee50559fddc2bbcd411fd7989bd7c3d9319e15b0e503fb126b9b31fd69ea Apr 22 16:51:22.190443 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.190406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.190594 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.190448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gf2\" (UniqueName: \"kubernetes.io/projected/18d8b17a-1897-4266-9555-e6c7d2803a15-kube-api-access-c6gf2\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: 
\"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.190594 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.190513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.291456 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.291408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.291456 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.291458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gf2\" (UniqueName: \"kubernetes.io/projected/18d8b17a-1897-4266-9555-e6c7d2803a15-kube-api-access-c6gf2\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.291690 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.291492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.291875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.291852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.291945 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.291861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.300130 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.300103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gf2\" (UniqueName: \"kubernetes.io/projected/18d8b17a-1897-4266-9555-e6c7d2803a15-kube-api-access-c6gf2\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.326718 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.326643 2575 generic.go:358] "Generic (PLEG): container finished" podID="590bf948-b178-4415-b16b-a6339ab50c2c" containerID="5781453bbcdc91cdba87f1e596c002116e1b0a290a2e2e58d3c37dde9a8e7803" exitCode=0 Apr 22 16:51:22.326828 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.326710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" 
event={"ID":"590bf948-b178-4415-b16b-a6339ab50c2c","Type":"ContainerDied","Data":"5781453bbcdc91cdba87f1e596c002116e1b0a290a2e2e58d3c37dde9a8e7803"} Apr 22 16:51:22.326828 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.326740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" event={"ID":"590bf948-b178-4415-b16b-a6339ab50c2c","Type":"ContainerStarted","Data":"fdca5b9f433cd67f376987b968f6b843df3bc00c75f97dde40562d4083532b43"} Apr 22 16:51:22.328209 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.328181 2575 generic.go:358] "Generic (PLEG): container finished" podID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerID="d49737bcc9084b284652c2f273d7d1288ec6e1f39967295ce9f866db1e6e2e89" exitCode=0 Apr 22 16:51:22.328280 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.328220 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" event={"ID":"9a6b497b-a168-4b3d-b259-07fc47a07416","Type":"ContainerDied","Data":"d49737bcc9084b284652c2f273d7d1288ec6e1f39967295ce9f866db1e6e2e89"} Apr 22 16:51:22.328280 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.328242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" event={"ID":"9a6b497b-a168-4b3d-b259-07fc47a07416","Type":"ContainerStarted","Data":"1548ee50559fddc2bbcd411fd7989bd7c3d9319e15b0e503fb126b9b31fd69ea"} Apr 22 16:51:22.409237 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.409207 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:22.475357 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.475323 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr"] Apr 22 16:51:22.478652 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.478631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.486919 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.486886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr"] Apr 22 16:51:22.536004 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.535975 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x"] Apr 22 16:51:22.537640 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:22.537616 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d8b17a_1897_4266_9555_e6c7d2803a15.slice/crio-65e0eb338113b75e509cf0c5f8f132a891cc4022ab42630ded56bf0f63483375 WatchSource:0}: Error finding container 65e0eb338113b75e509cf0c5f8f132a891cc4022ab42630ded56bf0f63483375: Status 404 returned error can't find the container with id 65e0eb338113b75e509cf0c5f8f132a891cc4022ab42630ded56bf0f63483375 Apr 22 16:51:22.594389 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.594359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pwzc\" (UniqueName: \"kubernetes.io/projected/8c5454eb-b854-441d-a6c7-44481e739f60-kube-api-access-2pwzc\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: 
\"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.594505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.594396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.594505 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.594476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.695494 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.695460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.695642 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.695543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.695642 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.695616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pwzc\" (UniqueName: \"kubernetes.io/projected/8c5454eb-b854-441d-a6c7-44481e739f60-kube-api-access-2pwzc\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.695860 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.695822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.695904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.695865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.704583 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.704558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pwzc\" (UniqueName: \"kubernetes.io/projected/8c5454eb-b854-441d-a6c7-44481e739f60-kube-api-access-2pwzc\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 
16:51:22.791598 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.791572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:22.914536 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:22.914511 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr"] Apr 22 16:51:22.915987 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:22.915957 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5454eb_b854_441d_a6c7_44481e739f60.slice/crio-02acb074295875aecab3d1a80d0ab12e10bce4fbd48f085aff175fa5b79aabb7 WatchSource:0}: Error finding container 02acb074295875aecab3d1a80d0ab12e10bce4fbd48f085aff175fa5b79aabb7: Status 404 returned error can't find the container with id 02acb074295875aecab3d1a80d0ab12e10bce4fbd48f085aff175fa5b79aabb7 Apr 22 16:51:23.333116 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.333082 2575 generic.go:358] "Generic (PLEG): container finished" podID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerID="5c39ab489cb6f3188f258cfd7228f438c89d5f3fb29a623b2d6fb56e43694912" exitCode=0 Apr 22 16:51:23.333258 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.333175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" event={"ID":"18d8b17a-1897-4266-9555-e6c7d2803a15","Type":"ContainerDied","Data":"5c39ab489cb6f3188f258cfd7228f438c89d5f3fb29a623b2d6fb56e43694912"} Apr 22 16:51:23.333258 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.333215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" 
event={"ID":"18d8b17a-1897-4266-9555-e6c7d2803a15","Type":"ContainerStarted","Data":"65e0eb338113b75e509cf0c5f8f132a891cc4022ab42630ded56bf0f63483375"} Apr 22 16:51:23.334822 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.334793 2575 generic.go:358] "Generic (PLEG): container finished" podID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerID="597ad9fbcd129307fa13c1eed92dd7c214d963b31b2c725af28af40603aee75f" exitCode=0 Apr 22 16:51:23.334934 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.334822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" event={"ID":"9a6b497b-a168-4b3d-b259-07fc47a07416","Type":"ContainerDied","Data":"597ad9fbcd129307fa13c1eed92dd7c214d963b31b2c725af28af40603aee75f"} Apr 22 16:51:23.336664 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.336640 2575 generic.go:358] "Generic (PLEG): container finished" podID="590bf948-b178-4415-b16b-a6339ab50c2c" containerID="4f4da3e072ab39d3b97221d8d48381372513ad8cea81339c973bfadfa9e263d3" exitCode=0 Apr 22 16:51:23.336765 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.336706 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" event={"ID":"590bf948-b178-4415-b16b-a6339ab50c2c","Type":"ContainerDied","Data":"4f4da3e072ab39d3b97221d8d48381372513ad8cea81339c973bfadfa9e263d3"} Apr 22 16:51:23.338065 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.338019 2575 generic.go:358] "Generic (PLEG): container finished" podID="8c5454eb-b854-441d-a6c7-44481e739f60" containerID="580a7868dd11a741c25db0d79d3da0cf4d2126278c69223e2d71a4aab7196406" exitCode=0 Apr 22 16:51:23.338214 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.338064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" 
event={"ID":"8c5454eb-b854-441d-a6c7-44481e739f60","Type":"ContainerDied","Data":"580a7868dd11a741c25db0d79d3da0cf4d2126278c69223e2d71a4aab7196406"} Apr 22 16:51:23.338214 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:23.338090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" event={"ID":"8c5454eb-b854-441d-a6c7-44481e739f60","Type":"ContainerStarted","Data":"02acb074295875aecab3d1a80d0ab12e10bce4fbd48f085aff175fa5b79aabb7"} Apr 22 16:51:24.344437 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:24.344410 2575 generic.go:358] "Generic (PLEG): container finished" podID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerID="680f129402521d40b6a759fdb65979b815d753430aee4869e4c10c09ced4021d" exitCode=0 Apr 22 16:51:24.344818 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:24.344496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" event={"ID":"9a6b497b-a168-4b3d-b259-07fc47a07416","Type":"ContainerDied","Data":"680f129402521d40b6a759fdb65979b815d753430aee4869e4c10c09ced4021d"} Apr 22 16:51:24.346629 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:24.346603 2575 generic.go:358] "Generic (PLEG): container finished" podID="590bf948-b178-4415-b16b-a6339ab50c2c" containerID="bc22e1e5c9bff97bcfbe811c102a2e8277817f6ef6fabc149c65f32326da9ede" exitCode=0 Apr 22 16:51:24.346854 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:24.346687 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" event={"ID":"590bf948-b178-4415-b16b-a6339ab50c2c","Type":"ContainerDied","Data":"bc22e1e5c9bff97bcfbe811c102a2e8277817f6ef6fabc149c65f32326da9ede"} Apr 22 16:51:25.351952 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.351913 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="8c5454eb-b854-441d-a6c7-44481e739f60" containerID="b88b89ae8c2374046282c518bb7b13c980de4fda782e7a60c06f91450cb8a155" exitCode=0 Apr 22 16:51:25.352378 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.351992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" event={"ID":"8c5454eb-b854-441d-a6c7-44481e739f60","Type":"ContainerDied","Data":"b88b89ae8c2374046282c518bb7b13c980de4fda782e7a60c06f91450cb8a155"} Apr 22 16:51:25.353676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.353652 2575 generic.go:358] "Generic (PLEG): container finished" podID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerID="ebd85a5a8a0c14401ce6e0e01f22770a9edbb4a481d6483c87dcf1a55b45da18" exitCode=0 Apr 22 16:51:25.353812 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.353789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" event={"ID":"18d8b17a-1897-4266-9555-e6c7d2803a15","Type":"ContainerDied","Data":"ebd85a5a8a0c14401ce6e0e01f22770a9edbb4a481d6483c87dcf1a55b45da18"} Apr 22 16:51:25.501382 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.501353 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:25.619228 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.619150 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-bundle\") pod \"9a6b497b-a168-4b3d-b259-07fc47a07416\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " Apr 22 16:51:25.619228 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.619208 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbmhg\" (UniqueName: \"kubernetes.io/projected/9a6b497b-a168-4b3d-b259-07fc47a07416-kube-api-access-sbmhg\") pod \"9a6b497b-a168-4b3d-b259-07fc47a07416\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " Apr 22 16:51:25.619442 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.619308 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-util\") pod \"9a6b497b-a168-4b3d-b259-07fc47a07416\" (UID: \"9a6b497b-a168-4b3d-b259-07fc47a07416\") " Apr 22 16:51:25.619633 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.619608 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-bundle" (OuterVolumeSpecName: "bundle") pod "9a6b497b-a168-4b3d-b259-07fc47a07416" (UID: "9a6b497b-a168-4b3d-b259-07fc47a07416"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:25.621406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.621385 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6b497b-a168-4b3d-b259-07fc47a07416-kube-api-access-sbmhg" (OuterVolumeSpecName: "kube-api-access-sbmhg") pod "9a6b497b-a168-4b3d-b259-07fc47a07416" (UID: "9a6b497b-a168-4b3d-b259-07fc47a07416"). InnerVolumeSpecName "kube-api-access-sbmhg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:51:25.623798 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.623766 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-util" (OuterVolumeSpecName: "util") pod "9a6b497b-a168-4b3d-b259-07fc47a07416" (UID: "9a6b497b-a168-4b3d-b259-07fc47a07416"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:25.644200 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.644179 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:25.720660 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.720626 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k47sl\" (UniqueName: \"kubernetes.io/projected/590bf948-b178-4415-b16b-a6339ab50c2c-kube-api-access-k47sl\") pod \"590bf948-b178-4415-b16b-a6339ab50c2c\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " Apr 22 16:51:25.720855 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.720717 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-bundle\") pod \"590bf948-b178-4415-b16b-a6339ab50c2c\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " Apr 22 16:51:25.720855 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.720763 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-util\") pod \"590bf948-b178-4415-b16b-a6339ab50c2c\" (UID: \"590bf948-b178-4415-b16b-a6339ab50c2c\") " Apr 22 16:51:25.720970 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.720900 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:25.720970 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.720910 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a6b497b-a168-4b3d-b259-07fc47a07416-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:25.720970 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.720919 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbmhg\" (UniqueName: 
\"kubernetes.io/projected/9a6b497b-a168-4b3d-b259-07fc47a07416-kube-api-access-sbmhg\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:25.721381 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.721353 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-bundle" (OuterVolumeSpecName: "bundle") pod "590bf948-b178-4415-b16b-a6339ab50c2c" (UID: "590bf948-b178-4415-b16b-a6339ab50c2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:25.722843 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.722824 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590bf948-b178-4415-b16b-a6339ab50c2c-kube-api-access-k47sl" (OuterVolumeSpecName: "kube-api-access-k47sl") pod "590bf948-b178-4415-b16b-a6339ab50c2c" (UID: "590bf948-b178-4415-b16b-a6339ab50c2c"). InnerVolumeSpecName "kube-api-access-k47sl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:51:25.725949 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.725927 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-util" (OuterVolumeSpecName: "util") pod "590bf948-b178-4415-b16b-a6339ab50c2c" (UID: "590bf948-b178-4415-b16b-a6339ab50c2c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:25.822004 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.821959 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:25.822004 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.822001 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/590bf948-b178-4415-b16b-a6339ab50c2c-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:25.822004 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:25.822011 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k47sl\" (UniqueName: \"kubernetes.io/projected/590bf948-b178-4415-b16b-a6339ab50c2c-kube-api-access-k47sl\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:26.359444 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.359402 2575 generic.go:358] "Generic (PLEG): container finished" podID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerID="1493a339d3280a13371fe550840391e5303c823d68943be468319ea6f959c9da" exitCode=0 Apr 22 16:51:26.359862 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.359492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" event={"ID":"18d8b17a-1897-4266-9555-e6c7d2803a15","Type":"ContainerDied","Data":"1493a339d3280a13371fe550840391e5303c823d68943be468319ea6f959c9da"} Apr 22 16:51:26.361276 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.361256 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" Apr 22 16:51:26.361276 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.361265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k" event={"ID":"9a6b497b-a168-4b3d-b259-07fc47a07416","Type":"ContainerDied","Data":"1548ee50559fddc2bbcd411fd7989bd7c3d9319e15b0e503fb126b9b31fd69ea"} Apr 22 16:51:26.361442 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.361292 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1548ee50559fddc2bbcd411fd7989bd7c3d9319e15b0e503fb126b9b31fd69ea" Apr 22 16:51:26.363109 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.363089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" event={"ID":"590bf948-b178-4415-b16b-a6339ab50c2c","Type":"ContainerDied","Data":"fdca5b9f433cd67f376987b968f6b843df3bc00c75f97dde40562d4083532b43"} Apr 22 16:51:26.363109 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.363101 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v" Apr 22 16:51:26.363238 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.363114 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdca5b9f433cd67f376987b968f6b843df3bc00c75f97dde40562d4083532b43" Apr 22 16:51:26.365279 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.365259 2575 generic.go:358] "Generic (PLEG): container finished" podID="8c5454eb-b854-441d-a6c7-44481e739f60" containerID="d92f8e6aa219ebeb3651069dfe26a97f7fb3b257b27de03bb68a73945499dc28" exitCode=0 Apr 22 16:51:26.365382 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:26.365299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" event={"ID":"8c5454eb-b854-441d-a6c7-44481e739f60","Type":"ContainerDied","Data":"d92f8e6aa219ebeb3651069dfe26a97f7fb3b257b27de03bb68a73945499dc28"} Apr 22 16:51:27.493118 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.493094 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:27.526492 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.526469 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:27.639096 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.638996 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-util\") pod \"18d8b17a-1897-4266-9555-e6c7d2803a15\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " Apr 22 16:51:27.639096 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.639031 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-bundle\") pod \"18d8b17a-1897-4266-9555-e6c7d2803a15\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " Apr 22 16:51:27.639096 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.639085 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gf2\" (UniqueName: \"kubernetes.io/projected/18d8b17a-1897-4266-9555-e6c7d2803a15-kube-api-access-c6gf2\") pod \"18d8b17a-1897-4266-9555-e6c7d2803a15\" (UID: \"18d8b17a-1897-4266-9555-e6c7d2803a15\") " Apr 22 16:51:27.639342 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.639110 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pwzc\" (UniqueName: \"kubernetes.io/projected/8c5454eb-b854-441d-a6c7-44481e739f60-kube-api-access-2pwzc\") pod \"8c5454eb-b854-441d-a6c7-44481e739f60\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " Apr 22 16:51:27.639342 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.639130 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-bundle\") pod \"8c5454eb-b854-441d-a6c7-44481e739f60\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " Apr 22 16:51:27.639342 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:51:27.639211 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-util\") pod \"8c5454eb-b854-441d-a6c7-44481e739f60\" (UID: \"8c5454eb-b854-441d-a6c7-44481e739f60\") " Apr 22 16:51:27.639630 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.639605 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-bundle" (OuterVolumeSpecName: "bundle") pod "18d8b17a-1897-4266-9555-e6c7d2803a15" (UID: "18d8b17a-1897-4266-9555-e6c7d2803a15"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:27.639816 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.639789 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-bundle" (OuterVolumeSpecName: "bundle") pod "8c5454eb-b854-441d-a6c7-44481e739f60" (UID: "8c5454eb-b854-441d-a6c7-44481e739f60"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:27.641285 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.641257 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5454eb-b854-441d-a6c7-44481e739f60-kube-api-access-2pwzc" (OuterVolumeSpecName: "kube-api-access-2pwzc") pod "8c5454eb-b854-441d-a6c7-44481e739f60" (UID: "8c5454eb-b854-441d-a6c7-44481e739f60"). InnerVolumeSpecName "kube-api-access-2pwzc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:51:27.641688 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.641660 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d8b17a-1897-4266-9555-e6c7d2803a15-kube-api-access-c6gf2" (OuterVolumeSpecName: "kube-api-access-c6gf2") pod "18d8b17a-1897-4266-9555-e6c7d2803a15" (UID: "18d8b17a-1897-4266-9555-e6c7d2803a15"). InnerVolumeSpecName "kube-api-access-c6gf2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:51:27.644677 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.644653 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-util" (OuterVolumeSpecName: "util") pod "18d8b17a-1897-4266-9555-e6c7d2803a15" (UID: "18d8b17a-1897-4266-9555-e6c7d2803a15"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:27.645416 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.645392 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-util" (OuterVolumeSpecName: "util") pod "8c5454eb-b854-441d-a6c7-44481e739f60" (UID: "8c5454eb-b854-441d-a6c7-44481e739f60"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:51:27.740579 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.740542 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:27.740579 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.740574 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6gf2\" (UniqueName: \"kubernetes.io/projected/18d8b17a-1897-4266-9555-e6c7d2803a15-kube-api-access-c6gf2\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:27.740579 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.740586 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pwzc\" (UniqueName: \"kubernetes.io/projected/8c5454eb-b854-441d-a6c7-44481e739f60-kube-api-access-2pwzc\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:27.740795 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.740595 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-bundle\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:27.740795 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.740604 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8c5454eb-b854-441d-a6c7-44481e739f60-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:27.740795 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:27.740613 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18d8b17a-1897-4266-9555-e6c7d2803a15-util\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:51:28.375018 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:28.374979 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" event={"ID":"8c5454eb-b854-441d-a6c7-44481e739f60","Type":"ContainerDied","Data":"02acb074295875aecab3d1a80d0ab12e10bce4fbd48f085aff175fa5b79aabb7"} Apr 22 16:51:28.375018 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:28.375019 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02acb074295875aecab3d1a80d0ab12e10bce4fbd48f085aff175fa5b79aabb7" Apr 22 16:51:28.375266 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:28.375002 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr" Apr 22 16:51:28.376533 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:28.376507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" event={"ID":"18d8b17a-1897-4266-9555-e6c7d2803a15","Type":"ContainerDied","Data":"65e0eb338113b75e509cf0c5f8f132a891cc4022ab42630ded56bf0f63483375"} Apr 22 16:51:28.376533 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:28.376521 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x" Apr 22 16:51:28.376533 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:28.376534 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65e0eb338113b75e509cf0c5f8f132a891cc4022ab42630ded56bf0f63483375" Apr 22 16:51:37.096762 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.096715 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7"] Apr 22 16:51:37.097274 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097253 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="util" Apr 22 16:51:37.097339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097279 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="util" Apr 22 16:51:37.097339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097293 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="extract" Apr 22 16:51:37.097339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097302 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="extract" Apr 22 16:51:37.097339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097320 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="pull" Apr 22 16:51:37.097339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097329 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="pull" Apr 22 16:51:37.097339 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097341 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="pull" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097349 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="pull" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097365 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097373 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097386 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="util" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097394 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="util" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097405 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="pull" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097414 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="pull" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097424 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="pull" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097433 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="pull" Apr 22 
16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097454 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="util" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097462 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="util" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097479 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="util" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097485 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="util" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097492 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097497 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097503 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097507 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097577 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="590bf948-b178-4415-b16b-a6339ab50c2c" containerName="extract" Apr 22 16:51:37.097574 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:51:37.097586 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c5454eb-b854-441d-a6c7-44481e739f60" containerName="extract" Apr 22 16:51:37.098526 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097596 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a6b497b-a168-4b3d-b259-07fc47a07416" containerName="extract" Apr 22 16:51:37.098526 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.097602 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="18d8b17a-1897-4266-9555-e6c7d2803a15" containerName="extract" Apr 22 16:51:37.100295 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.100253 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:37.103457 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.103408 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-s6n7l\"" Apr 22 16:51:37.118901 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.118868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7"] Apr 22 16:51:37.227954 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.227909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6v5l\" (UniqueName: \"kubernetes.io/projected/062af3a6-17e1-43b9-9137-dd9a6a35adbc-kube-api-access-w6v5l\") pod \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" (UID: \"062af3a6-17e1-43b9-9137-dd9a6a35adbc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:37.329106 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.329068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6v5l\" (UniqueName: 
\"kubernetes.io/projected/062af3a6-17e1-43b9-9137-dd9a6a35adbc-kube-api-access-w6v5l\") pod \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" (UID: \"062af3a6-17e1-43b9-9137-dd9a6a35adbc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:37.337108 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.337086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6v5l\" (UniqueName: \"kubernetes.io/projected/062af3a6-17e1-43b9-9137-dd9a6a35adbc-kube-api-access-w6v5l\") pod \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" (UID: \"062af3a6-17e1-43b9-9137-dd9a6a35adbc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:37.417482 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.417401 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:37.548765 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:37.548736 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7"] Apr 22 16:51:37.551593 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:37.551564 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod062af3a6_17e1_43b9_9137_dd9a6a35adbc.slice/crio-0180491a9c92b94b5dc9a73a32bf5260836b47a2c3b77fdda2502ece4822870a WatchSource:0}: Error finding container 0180491a9c92b94b5dc9a73a32bf5260836b47a2c3b77fdda2502ece4822870a: Status 404 returned error can't find the container with id 0180491a9c92b94b5dc9a73a32bf5260836b47a2c3b77fdda2502ece4822870a Apr 22 16:51:38.414899 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:38.414858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" 
event={"ID":"062af3a6-17e1-43b9-9137-dd9a6a35adbc","Type":"ContainerStarted","Data":"0180491a9c92b94b5dc9a73a32bf5260836b47a2c3b77fdda2502ece4822870a"} Apr 22 16:51:39.419782 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:39.419749 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" event={"ID":"062af3a6-17e1-43b9-9137-dd9a6a35adbc","Type":"ContainerStarted","Data":"f812820e56ac177a736d3786d47bd8f69d9b58237f7e9143386b16a3cf6b7261"} Apr 22 16:51:39.420070 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:39.419810 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:39.437939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:39.437897 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" podStartSLOduration=0.666202332 podStartE2EDuration="2.437882627s" podCreationTimestamp="2026-04-22 16:51:37 +0000 UTC" firstStartedPulling="2026-04-22 16:51:37.553638051 +0000 UTC m=+1797.255656139" lastFinishedPulling="2026-04-22 16:51:39.325318347 +0000 UTC m=+1799.027336434" observedRunningTime="2026-04-22 16:51:39.435967284 +0000 UTC m=+1799.137985392" watchObservedRunningTime="2026-04-22 16:51:39.437882627 +0000 UTC m=+1799.139900736" Apr 22 16:51:40.895410 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:40.895302 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:51:40.929643 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:40.895615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:51:46.157032 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:51:46.156997 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85"] Apr 22 16:51:46.160784 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.160764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.163236 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.163215 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-75bhj\"" Apr 22 16:51:46.175725 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.175699 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85"] Apr 22 16:51:46.305790 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.305754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkbb\" (UniqueName: \"kubernetes.io/projected/ac860f53-90a3-4483-9bb6-d7e364035e3d-kube-api-access-slkbb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.305960 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.305832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac860f53-90a3-4483-9bb6-d7e364035e3d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.406309 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.406276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-slkbb\" (UniqueName: \"kubernetes.io/projected/ac860f53-90a3-4483-9bb6-d7e364035e3d-kube-api-access-slkbb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.406477 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.406332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac860f53-90a3-4483-9bb6-d7e364035e3d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.406637 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.406619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac860f53-90a3-4483-9bb6-d7e364035e3d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.417545 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.417479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkbb\" (UniqueName: \"kubernetes.io/projected/ac860f53-90a3-4483-9bb6-d7e364035e3d-kube-api-access-slkbb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.470872 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.470836 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:46.594432 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:46.594389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85"] Apr 22 16:51:46.597856 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:51:46.597821 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac860f53_90a3_4483_9bb6_d7e364035e3d.slice/crio-f9db2b0d11e2a84e9f95f9afa8f5c115e95a01f4a2e3b98336f0528269e66d4d WatchSource:0}: Error finding container f9db2b0d11e2a84e9f95f9afa8f5c115e95a01f4a2e3b98336f0528269e66d4d: Status 404 returned error can't find the container with id f9db2b0d11e2a84e9f95f9afa8f5c115e95a01f4a2e3b98336f0528269e66d4d Apr 22 16:51:47.449475 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:47.449440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" event={"ID":"ac860f53-90a3-4483-9bb6-d7e364035e3d","Type":"ContainerStarted","Data":"f9db2b0d11e2a84e9f95f9afa8f5c115e95a01f4a2e3b98336f0528269e66d4d"} Apr 22 16:51:50.425454 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:50.425423 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:51:52.472096 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:52.472024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" event={"ID":"ac860f53-90a3-4483-9bb6-d7e364035e3d","Type":"ContainerStarted","Data":"91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad"} Apr 22 16:51:52.472530 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:52.472130 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:51:52.497074 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:51:52.497010 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" podStartSLOduration=1.416444694 podStartE2EDuration="6.496995895s" podCreationTimestamp="2026-04-22 16:51:46 +0000 UTC" firstStartedPulling="2026-04-22 16:51:46.60026689 +0000 UTC m=+1806.302284977" lastFinishedPulling="2026-04-22 16:51:51.680818091 +0000 UTC m=+1811.382836178" observedRunningTime="2026-04-22 16:51:52.494560668 +0000 UTC m=+1812.196578778" watchObservedRunningTime="2026-04-22 16:51:52.496995895 +0000 UTC m=+1812.199014004" Apr 22 16:52:03.477833 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:03.477804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:52:05.245757 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.245721 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85"] Apr 22 16:52:05.246172 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.245911 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" containerName="manager" containerID="cri-o://91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad" gracePeriod=2 Apr 22 16:52:05.248875 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.248840 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" 
cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.252576 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.252550 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85"] Apr 22 16:52:05.267126 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.267099 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4"] Apr 22 16:52:05.267602 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.267584 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" containerName="manager" Apr 22 16:52:05.267679 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.267604 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" containerName="manager" Apr 22 16:52:05.267757 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.267744 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" containerName="manager" Apr 22 16:52:05.270993 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.270972 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.283147 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.283117 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4"] Apr 22 16:52:05.290857 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.290829 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7"] Apr 22 16:52:05.291116 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.291090 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" containerName="manager" containerID="cri-o://f812820e56ac177a736d3786d47bd8f69d9b58237f7e9143386b16a3cf6b7261" gracePeriod=2 Apr 22 16:52:05.296498 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.296460 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.298415 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.298381 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found 
between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.305406 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.305380 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7"] Apr 22 16:52:05.321516 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.321486 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8"] Apr 22 16:52:05.322067 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.322021 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" containerName="manager" Apr 22 16:52:05.322067 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.322059 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" containerName="manager" Apr 22 16:52:05.322194 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.322142 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" containerName="manager" Apr 22 16:52:05.325472 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.325451 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:05.327742 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.327715 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.334063 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.334021 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8"] Apr 22 16:52:05.353776 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.353733 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.371174 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.371153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6fwt4\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.371266 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:52:05.371193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqvc\" (UniqueName: \"kubernetes.io/projected/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-kube-api-access-nzqvc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6fwt4\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.371314 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.371302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fbkx\" (UniqueName: \"kubernetes.io/projected/ab8ddab0-f0c5-4474-8120-fa85e91529c2-kube-api-access-7fbkx\") pod \"limitador-operator-controller-manager-85c4996f8c-xjmf8\" (UID: \"ab8ddab0-f0c5-4474-8120-fa85e91529c2\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:05.472738 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.472702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fbkx\" (UniqueName: \"kubernetes.io/projected/ab8ddab0-f0c5-4474-8120-fa85e91529c2-kube-api-access-7fbkx\") pod \"limitador-operator-controller-manager-85c4996f8c-xjmf8\" (UID: \"ab8ddab0-f0c5-4474-8120-fa85e91529c2\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:05.472867 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.472750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6fwt4\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.472867 ip-10-0-142-238 kubenswrapper[2575]: I0422 
16:52:05.472791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqvc\" (UniqueName: \"kubernetes.io/projected/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-kube-api-access-nzqvc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6fwt4\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.473249 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.473226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6fwt4\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.483086 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.483030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fbkx\" (UniqueName: \"kubernetes.io/projected/ab8ddab0-f0c5-4474-8120-fa85e91529c2-kube-api-access-7fbkx\") pod \"limitador-operator-controller-manager-85c4996f8c-xjmf8\" (UID: \"ab8ddab0-f0c5-4474-8120-fa85e91529c2\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:05.483182 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.483107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqvc\" (UniqueName: \"kubernetes.io/projected/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-kube-api-access-nzqvc\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6fwt4\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.508587 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.508564 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:52:05.510717 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.510691 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.512665 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.512636 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.522119 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.522087 2575 generic.go:358] "Generic (PLEG): container finished" podID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" containerID="f812820e56ac177a736d3786d47bd8f69d9b58237f7e9143386b16a3cf6b7261" exitCode=0 Apr 22 16:52:05.523512 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.523486 2575 generic.go:358] "Generic (PLEG): container finished" podID="ac860f53-90a3-4483-9bb6-d7e364035e3d" containerID="91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad" exitCode=0 Apr 22 16:52:05.523610 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.523526 2575 scope.go:117] "RemoveContainer" containerID="91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad" Apr 22 16:52:05.523610 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.523554 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" Apr 22 16:52:05.530743 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.526495 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:52:05.530743 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.527172 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.532508 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.532320 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.534635 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.534612 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get 
resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.536384 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.536359 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.537386 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.537369 2575 scope.go:117] "RemoveContainer" containerID="91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad" Apr 22 16:52:05.537650 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:52:05.537632 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad\": container with ID starting with 91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad not found: ID does not exist" containerID="91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad" Apr 22 16:52:05.537710 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.537663 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad"} err="failed to get container status \"91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad\": rpc error: code = NotFound desc = could not find container \"91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad\": container with ID starting with 91db3cee07310f3f4d63b9c731ca9569cf5ef831dbe9d069b00c0d6a800e96ad not found: ID 
does not exist" Apr 22 16:52:05.573252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.573223 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac860f53-90a3-4483-9bb6-d7e364035e3d-extensions-socket-volume\") pod \"ac860f53-90a3-4483-9bb6-d7e364035e3d\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " Apr 22 16:52:05.573399 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.573335 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6v5l\" (UniqueName: \"kubernetes.io/projected/062af3a6-17e1-43b9-9137-dd9a6a35adbc-kube-api-access-w6v5l\") pod \"062af3a6-17e1-43b9-9137-dd9a6a35adbc\" (UID: \"062af3a6-17e1-43b9-9137-dd9a6a35adbc\") " Apr 22 16:52:05.573399 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.573366 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkbb\" (UniqueName: \"kubernetes.io/projected/ac860f53-90a3-4483-9bb6-d7e364035e3d-kube-api-access-slkbb\") pod \"ac860f53-90a3-4483-9bb6-d7e364035e3d\" (UID: \"ac860f53-90a3-4483-9bb6-d7e364035e3d\") " Apr 22 16:52:05.573768 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.573744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac860f53-90a3-4483-9bb6-d7e364035e3d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ac860f53-90a3-4483-9bb6-d7e364035e3d" (UID: "ac860f53-90a3-4483-9bb6-d7e364035e3d"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:52:05.575617 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.575594 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062af3a6-17e1-43b9-9137-dd9a6a35adbc-kube-api-access-w6v5l" (OuterVolumeSpecName: "kube-api-access-w6v5l") pod "062af3a6-17e1-43b9-9137-dd9a6a35adbc" (UID: "062af3a6-17e1-43b9-9137-dd9a6a35adbc"). InnerVolumeSpecName "kube-api-access-w6v5l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:52:05.575685 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.575609 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac860f53-90a3-4483-9bb6-d7e364035e3d-kube-api-access-slkbb" (OuterVolumeSpecName: "kube-api-access-slkbb") pod "ac860f53-90a3-4483-9bb6-d7e364035e3d" (UID: "ac860f53-90a3-4483-9bb6-d7e364035e3d"). InnerVolumeSpecName "kube-api-access-slkbb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:52:05.672112 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.672077 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:05.674300 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.674276 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w6v5l\" (UniqueName: \"kubernetes.io/projected/062af3a6-17e1-43b9-9137-dd9a6a35adbc-kube-api-access-w6v5l\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:52:05.674346 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.674307 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-slkbb\" (UniqueName: \"kubernetes.io/projected/ac860f53-90a3-4483-9bb6-d7e364035e3d-kube-api-access-slkbb\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:52:05.674346 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.674323 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac860f53-90a3-4483-9bb6-d7e364035e3d-extensions-socket-volume\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:52:05.677971 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.677951 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:05.819488 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.819456 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4"] Apr 22 16:52:05.822580 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:52:05.822547 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6c19f8_c2f1_4fb0_b7ab_9e9f25f0ae26.slice/crio-7fe6912f2d79e7d401c697c2e71630d177fc7ccdcc065130bed1be453ac655ca WatchSource:0}: Error finding container 7fe6912f2d79e7d401c697c2e71630d177fc7ccdcc065130bed1be453ac655ca: Status 404 returned error can't find the container with id 7fe6912f2d79e7d401c697c2e71630d177fc7ccdcc065130bed1be453ac655ca Apr 22 16:52:05.836648 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.836617 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:05.838395 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.838357 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this 
object" Apr 22 16:52:05.851288 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:05.851262 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8"] Apr 22 16:52:05.852699 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:52:05.852671 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab8ddab0_f0c5_4474_8120_fa85e91529c2.slice/crio-f371c3fa4317ccc5d22b30981b6cbaccc473ad1f88d4f764bb2728de8a4675f1 WatchSource:0}: Error finding container f371c3fa4317ccc5d22b30981b6cbaccc473ad1f88d4f764bb2728de8a4675f1: Status 404 returned error can't find the container with id f371c3fa4317ccc5d22b30981b6cbaccc473ad1f88d4f764bb2728de8a4675f1 Apr 22 16:52:06.529209 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.529116 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" Apr 22 16:52:06.529209 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.529113 2575 scope.go:117] "RemoveContainer" containerID="f812820e56ac177a736d3786d47bd8f69d9b58237f7e9143386b16a3cf6b7261" Apr 22 16:52:06.530825 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.530796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" event={"ID":"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26","Type":"ContainerStarted","Data":"1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0"} Apr 22 16:52:06.530940 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.530835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" event={"ID":"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26","Type":"ContainerStarted","Data":"7fe6912f2d79e7d401c697c2e71630d177fc7ccdcc065130bed1be453ac655ca"} Apr 22 16:52:06.531003 ip-10-0-142-238 
kubenswrapper[2575]: I0422 16:52:06.530939 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:06.531469 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.531443 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:06.533238 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.533210 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:06.533482 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.533463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" event={"ID":"ab8ddab0-f0c5-4474-8120-fa85e91529c2","Type":"ContainerStarted","Data":"8cb35ebda2315a0545f1080aafe7deb012c9ca575533c5887eac0bccc619906f"} Apr 22 16:52:06.533560 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.533488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" 
event={"ID":"ab8ddab0-f0c5-4474-8120-fa85e91529c2","Type":"ContainerStarted","Data":"f371c3fa4317ccc5d22b30981b6cbaccc473ad1f88d4f764bb2728de8a4675f1"} Apr 22 16:52:06.533676 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.533653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:06.534833 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.534809 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:06.536502 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.536471 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:06.556755 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.556720 2575 status_manager.go:895] "Failed to get status for pod" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-f9r85" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-f9r85\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no 
relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:06.557351 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.557308 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" podStartSLOduration=1.5572691079999998 podStartE2EDuration="1.557269108s" podCreationTimestamp="2026-04-22 16:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:52:06.55477772 +0000 UTC m=+1826.256795828" watchObservedRunningTime="2026-04-22 16:52:06.557269108 +0000 UTC m=+1826.259287217" Apr 22 16:52:06.581489 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.581448 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" podStartSLOduration=1.581437424 podStartE2EDuration="1.581437424s" podCreationTimestamp="2026-04-22 16:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:52:06.579278356 +0000 UTC m=+1826.281296464" watchObservedRunningTime="2026-04-22 16:52:06.581437424 +0000 UTC m=+1826.283455533" Apr 22 16:52:06.581672 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.581646 2575 status_manager.go:895] "Failed to get status for pod" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-hd2b7" err="pods \"limitador-operator-controller-manager-85c4996f8c-hd2b7\" is forbidden: User \"system:node:ip-10-0-142-238.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-238.ec2.internal' and this object" Apr 22 16:52:06.905319 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.905241 2575 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062af3a6-17e1-43b9-9137-dd9a6a35adbc" path="/var/lib/kubelet/pods/062af3a6-17e1-43b9-9137-dd9a6a35adbc/volumes" Apr 22 16:52:06.905581 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:06.905567 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac860f53-90a3-4483-9bb6-d7e364035e3d" path="/var/lib/kubelet/pods/ac860f53-90a3-4483-9bb6-d7e364035e3d/volumes" Apr 22 16:52:17.540949 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:17.540913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:17.541543 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:17.540974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjmf8" Apr 22 16:52:21.945886 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:21.945846 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4"] Apr 22 16:52:21.946379 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:21.946122 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" podUID="3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" containerName="manager" containerID="cri-o://1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0" gracePeriod=10 Apr 22 16:52:22.176544 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.176510 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv"] Apr 22 16:52:22.181002 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.180927 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.192111 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.192082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv"] Apr 22 16:52:22.207836 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.207813 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:22.325014 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.324983 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-extensions-socket-volume\") pod \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " Apr 22 16:52:22.325190 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.325096 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzqvc\" (UniqueName: \"kubernetes.io/projected/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-kube-api-access-nzqvc\") pod \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\" (UID: \"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26\") " Apr 22 16:52:22.325321 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.325300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/603e9f8f-75a1-4d5b-8d68-44f5577361c5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wwdgv\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.325374 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.325356 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xh29\" (UniqueName: \"kubernetes.io/projected/603e9f8f-75a1-4d5b-8d68-44f5577361c5-kube-api-access-9xh29\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wwdgv\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.325428 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.325413 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" (UID: "3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:52:22.325553 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.325541 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-extensions-socket-volume\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:52:22.327291 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.327263 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-kube-api-access-nzqvc" (OuterVolumeSpecName: "kube-api-access-nzqvc") pod "3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" (UID: "3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26"). InnerVolumeSpecName "kube-api-access-nzqvc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:52:22.425989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.425948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/603e9f8f-75a1-4d5b-8d68-44f5577361c5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wwdgv\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.425989 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.425995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xh29\" (UniqueName: \"kubernetes.io/projected/603e9f8f-75a1-4d5b-8d68-44f5577361c5-kube-api-access-9xh29\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wwdgv\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.426224 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.426066 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzqvc\" (UniqueName: \"kubernetes.io/projected/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26-kube-api-access-nzqvc\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:52:22.426379 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.426357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/603e9f8f-75a1-4d5b-8d68-44f5577361c5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wwdgv\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.434131 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.434109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9xh29\" (UniqueName: \"kubernetes.io/projected/603e9f8f-75a1-4d5b-8d68-44f5577361c5-kube-api-access-9xh29\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wwdgv\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.506108 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.505995 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:22.605181 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.605150 2575 generic.go:358] "Generic (PLEG): container finished" podID="3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" containerID="1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0" exitCode=0 Apr 22 16:52:22.605359 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.605231 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" Apr 22 16:52:22.605359 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.605234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" event={"ID":"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26","Type":"ContainerDied","Data":"1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0"} Apr 22 16:52:22.605359 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.605273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4" event={"ID":"3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26","Type":"ContainerDied","Data":"7fe6912f2d79e7d401c697c2e71630d177fc7ccdcc065130bed1be453ac655ca"} Apr 22 16:52:22.605359 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.605289 2575 scope.go:117] "RemoveContainer" containerID="1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0" Apr 22 
16:52:22.614860 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.614839 2575 scope.go:117] "RemoveContainer" containerID="1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0" Apr 22 16:52:22.615215 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:52:22.615182 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0\": container with ID starting with 1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0 not found: ID does not exist" containerID="1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0" Apr 22 16:52:22.615298 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.615226 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0"} err="failed to get container status \"1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0\": rpc error: code = NotFound desc = could not find container \"1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0\": container with ID starting with 1576fb81ba9609e796397a3f60d4b5d7f2681a63b51297a27e472831902b0ec0 not found: ID does not exist" Apr 22 16:52:22.628528 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.628502 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4"] Apr 22 16:52:22.633203 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.632990 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv"] Apr 22 16:52:22.635349 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.635328 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6fwt4"] Apr 22 16:52:22.636195 ip-10-0-142-238 kubenswrapper[2575]: W0422 
16:52:22.636173 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod603e9f8f_75a1_4d5b_8d68_44f5577361c5.slice/crio-8e1f7357ed495cda2879c78544edec30271053a45f0cf1168c6e41838ef41dca WatchSource:0}: Error finding container 8e1f7357ed495cda2879c78544edec30271053a45f0cf1168c6e41838ef41dca: Status 404 returned error can't find the container with id 8e1f7357ed495cda2879c78544edec30271053a45f0cf1168c6e41838ef41dca Apr 22 16:52:22.906143 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:22.906066 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" path="/var/lib/kubelet/pods/3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26/volumes" Apr 22 16:52:23.612472 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:23.612436 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" event={"ID":"603e9f8f-75a1-4d5b-8d68-44f5577361c5","Type":"ContainerStarted","Data":"22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9"} Apr 22 16:52:23.612472 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:23.612471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" event={"ID":"603e9f8f-75a1-4d5b-8d68-44f5577361c5","Type":"ContainerStarted","Data":"8e1f7357ed495cda2879c78544edec30271053a45f0cf1168c6e41838ef41dca"} Apr 22 16:52:23.612862 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:23.612575 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:23.654647 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:23.654607 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" podStartSLOduration=1.654593609 
podStartE2EDuration="1.654593609s" podCreationTimestamp="2026-04-22 16:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:52:23.652903403 +0000 UTC m=+1843.354921513" watchObservedRunningTime="2026-04-22 16:52:23.654593609 +0000 UTC m=+1843.356611717" Apr 22 16:52:34.619097 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:34.619021 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 16:52:38.212100 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.212005 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5"] Apr 22 16:52:38.212540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.212475 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" containerName="manager" Apr 22 16:52:38.212540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.212493 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" containerName="manager" Apr 22 16:52:38.212673 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.212595 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d6c19f8-c2f1-4fb0-b7ab-9e9f25f0ae26" containerName="manager" Apr 22 16:52:38.215951 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.215930 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.219182 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.219158 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-tfxk7\"" Apr 22 16:52:38.228095 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.227520 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5"] Apr 22 16:52:38.264939 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.264908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.264946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgh6m\" (UniqueName: \"kubernetes.io/projected/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-kube-api-access-zgh6m\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.264998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.265022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.265062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.265088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265388 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.265117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265388 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.265148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.265388 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.265167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.365573 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.365739 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 
16:52:38.365739 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.365739 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.365739 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgh6m\" (UniqueName: \"kubernetes.io/projected/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-kube-api-access-zgh6m\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.365739 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366019 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365757 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366019 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366019 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.365997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366231 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.366191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366331 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.366309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366386 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.366327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366425 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.366322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.366626 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.366607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.368020 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.367996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: 
\"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.368276 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.368257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.373214 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.373193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.373499 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.373477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgh6m\" (UniqueName: \"kubernetes.io/projected/ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb-kube-api-access-zgh6m\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zxbs5\" (UID: \"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.530173 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.530097 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:38.658668 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.658637 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5"] Apr 22 16:52:38.659729 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:52:38.659706 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddc3ac5b_9a8f_444e_8119_aca65c5a6fcb.slice/crio-57a6e8244fa218be0ebdd574f7c0b5e2eb8a151f13b9c10f1832f7e93ef0ec50 WatchSource:0}: Error finding container 57a6e8244fa218be0ebdd574f7c0b5e2eb8a151f13b9c10f1832f7e93ef0ec50: Status 404 returned error can't find the container with id 57a6e8244fa218be0ebdd574f7c0b5e2eb8a151f13b9c10f1832f7e93ef0ec50 Apr 22 16:52:38.661958 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.661927 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 16:52:38.662062 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.661988 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 16:52:38.662062 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.662017 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 16:52:38.672218 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:38.672190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" 
event={"ID":"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb","Type":"ContainerStarted","Data":"57a6e8244fa218be0ebdd574f7c0b5e2eb8a151f13b9c10f1832f7e93ef0ec50"} Apr 22 16:52:39.678005 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:39.677974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" event={"ID":"ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb","Type":"ContainerStarted","Data":"73a2d5f91e44e84bc19bfc7f4b8ae09d876e462bf552601fcb2f5ef7d21c7fd1"} Apr 22 16:52:39.698071 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:39.698015 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" podStartSLOduration=1.6979991399999999 podStartE2EDuration="1.69799914s" podCreationTimestamp="2026-04-22 16:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:52:39.695088139 +0000 UTC m=+1859.397106241" watchObservedRunningTime="2026-04-22 16:52:39.69799914 +0000 UTC m=+1859.400017252" Apr 22 16:52:40.531142 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:40.531097 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:40.535927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:40.535900 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:40.685156 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:40.685118 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:40.686137 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:40.686111 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zxbs5" Apr 22 16:52:42.982350 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:42.982320 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2ncwf"] Apr 22 16:52:42.986991 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:42.986969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:42.989795 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:42.989774 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wgb6c\"" Apr 22 16:52:42.993101 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:42.993075 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2ncwf"] Apr 22 16:52:43.000233 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.000210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58q54\" (UniqueName: \"kubernetes.io/projected/531c54ee-be4c-491d-acab-c09a3c940055-kube-api-access-58q54\") pod \"authorino-f99f4b5cd-2ncwf\" (UID: \"531c54ee-be4c-491d-acab-c09a3c940055\") " pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:43.094070 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.094011 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-xwt5q"] Apr 22 16:52:43.097456 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.097433 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:52:43.100819 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.100797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58q54\" (UniqueName: \"kubernetes.io/projected/531c54ee-be4c-491d-acab-c09a3c940055-kube-api-access-58q54\") pod \"authorino-f99f4b5cd-2ncwf\" (UID: \"531c54ee-be4c-491d-acab-c09a3c940055\") " pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:43.100942 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.100828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56qq\" (UniqueName: \"kubernetes.io/projected/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796-kube-api-access-r56qq\") pod \"authorino-7498df8756-xwt5q\" (UID: \"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796\") " pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:52:43.101733 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.101706 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-xwt5q"] Apr 22 16:52:43.108445 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.108425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58q54\" (UniqueName: \"kubernetes.io/projected/531c54ee-be4c-491d-acab-c09a3c940055-kube-api-access-58q54\") pod \"authorino-f99f4b5cd-2ncwf\" (UID: \"531c54ee-be4c-491d-acab-c09a3c940055\") " pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:43.202061 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.202002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r56qq\" (UniqueName: \"kubernetes.io/projected/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796-kube-api-access-r56qq\") pod \"authorino-7498df8756-xwt5q\" (UID: \"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796\") " pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:52:43.211696 
ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.211675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r56qq\" (UniqueName: \"kubernetes.io/projected/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796-kube-api-access-r56qq\") pod \"authorino-7498df8756-xwt5q\" (UID: \"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796\") " pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:52:43.298460 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.298378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:43.407509 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.407481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:52:43.423238 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.423091 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2ncwf"] Apr 22 16:52:43.427181 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:52:43.427154 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod531c54ee_be4c_491d_acab_c09a3c940055.slice/crio-916d766325e4d91054757dee919d401414f92478ce7d99c6773a7efdb18e1c56 WatchSource:0}: Error finding container 916d766325e4d91054757dee919d401414f92478ce7d99c6773a7efdb18e1c56: Status 404 returned error can't find the container with id 916d766325e4d91054757dee919d401414f92478ce7d99c6773a7efdb18e1c56 Apr 22 16:52:43.529327 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.529303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-xwt5q"] Apr 22 16:52:43.531402 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:52:43.531374 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda869ea81_bd1b_4b61_9b20_8fbfb8cb2796.slice/crio-bde2e750359771baca5b29f8732f91fba95fd31a3c6b871c540d6b5716edf0f3 WatchSource:0}: Error finding container bde2e750359771baca5b29f8732f91fba95fd31a3c6b871c540d6b5716edf0f3: Status 404 returned error can't find the container with id bde2e750359771baca5b29f8732f91fba95fd31a3c6b871c540d6b5716edf0f3 Apr 22 16:52:43.696497 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.696468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" event={"ID":"531c54ee-be4c-491d-acab-c09a3c940055","Type":"ContainerStarted","Data":"916d766325e4d91054757dee919d401414f92478ce7d99c6773a7efdb18e1c56"} Apr 22 16:52:43.697566 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:43.697538 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xwt5q" event={"ID":"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796","Type":"ContainerStarted","Data":"bde2e750359771baca5b29f8732f91fba95fd31a3c6b871c540d6b5716edf0f3"} Apr 22 16:52:47.718575 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:47.718536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xwt5q" event={"ID":"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796","Type":"ContainerStarted","Data":"2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e"} Apr 22 16:52:47.719927 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:47.719906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" event={"ID":"531c54ee-be4c-491d-acab-c09a3c940055","Type":"ContainerStarted","Data":"558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d"} Apr 22 16:52:47.733843 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:47.733798 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-xwt5q" podStartSLOduration=1.232188989 
podStartE2EDuration="4.733786141s" podCreationTimestamp="2026-04-22 16:52:43 +0000 UTC" firstStartedPulling="2026-04-22 16:52:43.532695714 +0000 UTC m=+1863.234713802" lastFinishedPulling="2026-04-22 16:52:47.034292864 +0000 UTC m=+1866.736310954" observedRunningTime="2026-04-22 16:52:47.732902236 +0000 UTC m=+1867.434920345" watchObservedRunningTime="2026-04-22 16:52:47.733786141 +0000 UTC m=+1867.435804249" Apr 22 16:52:47.746204 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:47.746143 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" podStartSLOduration=2.15284163 podStartE2EDuration="5.746129451s" podCreationTimestamp="2026-04-22 16:52:42 +0000 UTC" firstStartedPulling="2026-04-22 16:52:43.428697172 +0000 UTC m=+1863.130715273" lastFinishedPulling="2026-04-22 16:52:47.021985004 +0000 UTC m=+1866.724003094" observedRunningTime="2026-04-22 16:52:47.745664812 +0000 UTC m=+1867.447682922" watchObservedRunningTime="2026-04-22 16:52:47.746129451 +0000 UTC m=+1867.448147560" Apr 22 16:52:47.774118 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:47.774087 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2ncwf"] Apr 22 16:52:49.729085 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:49.729026 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" podUID="531c54ee-be4c-491d-acab-c09a3c940055" containerName="authorino" containerID="cri-o://558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d" gracePeriod=30 Apr 22 16:52:49.974284 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:49.974260 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:50.067300 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.067214 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58q54\" (UniqueName: \"kubernetes.io/projected/531c54ee-be4c-491d-acab-c09a3c940055-kube-api-access-58q54\") pod \"531c54ee-be4c-491d-acab-c09a3c940055\" (UID: \"531c54ee-be4c-491d-acab-c09a3c940055\") " Apr 22 16:52:50.069389 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.069364 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531c54ee-be4c-491d-acab-c09a3c940055-kube-api-access-58q54" (OuterVolumeSpecName: "kube-api-access-58q54") pod "531c54ee-be4c-491d-acab-c09a3c940055" (UID: "531c54ee-be4c-491d-acab-c09a3c940055"). InnerVolumeSpecName "kube-api-access-58q54". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:52:50.168175 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.168143 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58q54\" (UniqueName: \"kubernetes.io/projected/531c54ee-be4c-491d-acab-c09a3c940055-kube-api-access-58q54\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:52:50.734196 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.734162 2575 generic.go:358] "Generic (PLEG): container finished" podID="531c54ee-be4c-491d-acab-c09a3c940055" containerID="558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d" exitCode=0 Apr 22 16:52:50.734630 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.734213 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" Apr 22 16:52:50.734630 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.734247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" event={"ID":"531c54ee-be4c-491d-acab-c09a3c940055","Type":"ContainerDied","Data":"558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d"} Apr 22 16:52:50.734630 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.734289 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2ncwf" event={"ID":"531c54ee-be4c-491d-acab-c09a3c940055","Type":"ContainerDied","Data":"916d766325e4d91054757dee919d401414f92478ce7d99c6773a7efdb18e1c56"} Apr 22 16:52:50.734630 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.734309 2575 scope.go:117] "RemoveContainer" containerID="558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d" Apr 22 16:52:50.744335 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.744316 2575 scope.go:117] "RemoveContainer" containerID="558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d" Apr 22 16:52:50.744590 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:52:50.744573 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d\": container with ID starting with 558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d not found: ID does not exist" containerID="558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d" Apr 22 16:52:50.744655 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.744597 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d"} err="failed to get container status \"558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d\": rpc error: code = 
NotFound desc = could not find container \"558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d\": container with ID starting with 558d7c4eebbd5c4ebe80da6f5c2f9108797e99439283983ef6428bfd21cd1a3d not found: ID does not exist" Apr 22 16:52:50.755540 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.755513 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2ncwf"] Apr 22 16:52:50.759032 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.759010 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2ncwf"] Apr 22 16:52:50.905795 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:52:50.905761 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531c54ee-be4c-491d-acab-c09a3c940055" path="/var/lib/kubelet/pods/531c54ee-be4c-491d-acab-c09a3c940055/volumes" Apr 22 16:53:11.079367 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.079332 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-58786b46b-7vj5g"] Apr 22 16:53:11.079764 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.079732 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531c54ee-be4c-491d-acab-c09a3c940055" containerName="authorino" Apr 22 16:53:11.079764 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.079744 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="531c54ee-be4c-491d-acab-c09a3c940055" containerName="authorino" Apr 22 16:53:11.079843 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.079802 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="531c54ee-be4c-491d-acab-c09a3c940055" containerName="authorino" Apr 22 16:53:11.082775 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.082754 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.084951 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.084930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 16:53:11.090221 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.089858 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-58786b46b-7vj5g"] Apr 22 16:53:11.150738 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.150712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-tls-cert\") pod \"authorino-58786b46b-7vj5g\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.150887 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.150799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfjj\" (UniqueName: \"kubernetes.io/projected/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-kube-api-access-kcfjj\") pod \"authorino-58786b46b-7vj5g\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.251434 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.251392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-tls-cert\") pod \"authorino-58786b46b-7vj5g\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.251589 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.251530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfjj\" (UniqueName: 
\"kubernetes.io/projected/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-kube-api-access-kcfjj\") pod \"authorino-58786b46b-7vj5g\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.253986 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.253956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-tls-cert\") pod \"authorino-58786b46b-7vj5g\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.259090 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.259072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfjj\" (UniqueName: \"kubernetes.io/projected/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-kube-api-access-kcfjj\") pod \"authorino-58786b46b-7vj5g\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.392867 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.392782 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:53:11.511519 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.511323 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-58786b46b-7vj5g"] Apr 22 16:53:11.514327 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:53:11.514300 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod182f6f3f_f045_4acd_ba01_3a4d5b314bb0.slice/crio-34ed51b50254e82101bb765ea9edaed054dd2768816a7b0f2d6d2c738d584197 WatchSource:0}: Error finding container 34ed51b50254e82101bb765ea9edaed054dd2768816a7b0f2d6d2c738d584197: Status 404 returned error can't find the container with id 34ed51b50254e82101bb765ea9edaed054dd2768816a7b0f2d6d2c738d584197 Apr 22 16:53:11.820692 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:11.820655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-58786b46b-7vj5g" event={"ID":"182f6f3f-f045-4acd-ba01-3a4d5b314bb0","Type":"ContainerStarted","Data":"34ed51b50254e82101bb765ea9edaed054dd2768816a7b0f2d6d2c738d584197"} Apr 22 16:53:12.825692 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:12.825655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-58786b46b-7vj5g" event={"ID":"182f6f3f-f045-4acd-ba01-3a4d5b314bb0","Type":"ContainerStarted","Data":"1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9"} Apr 22 16:53:12.840985 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:12.840938 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-58786b46b-7vj5g" podStartSLOduration=1.288902545 podStartE2EDuration="1.840924954s" podCreationTimestamp="2026-04-22 16:53:11 +0000 UTC" firstStartedPulling="2026-04-22 16:53:11.515629332 +0000 UTC m=+1891.217647420" lastFinishedPulling="2026-04-22 16:53:12.067651742 +0000 UTC m=+1891.769669829" 
observedRunningTime="2026-04-22 16:53:12.838452939 +0000 UTC m=+1892.540471047" watchObservedRunningTime="2026-04-22 16:53:12.840924954 +0000 UTC m=+1892.542943063" Apr 22 16:53:12.867984 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:12.867958 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-xwt5q"] Apr 22 16:53:12.868195 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:12.868154 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-xwt5q" podUID="a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" containerName="authorino" containerID="cri-o://2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e" gracePeriod=30 Apr 22 16:53:13.107312 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.107283 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:53:13.271334 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.271295 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r56qq\" (UniqueName: \"kubernetes.io/projected/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796-kube-api-access-r56qq\") pod \"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796\" (UID: \"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796\") " Apr 22 16:53:13.273479 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.273454 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796-kube-api-access-r56qq" (OuterVolumeSpecName: "kube-api-access-r56qq") pod "a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" (UID: "a869ea81-bd1b-4b61-9b20-8fbfb8cb2796"). InnerVolumeSpecName "kube-api-access-r56qq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:53:13.372802 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.372723 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r56qq\" (UniqueName: \"kubernetes.io/projected/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796-kube-api-access-r56qq\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:53:13.830578 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.830541 2575 generic.go:358] "Generic (PLEG): container finished" podID="a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" containerID="2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e" exitCode=0 Apr 22 16:53:13.831120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.830594 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-xwt5q" Apr 22 16:53:13.831120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.830630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xwt5q" event={"ID":"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796","Type":"ContainerDied","Data":"2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e"} Apr 22 16:53:13.831120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.830670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-xwt5q" event={"ID":"a869ea81-bd1b-4b61-9b20-8fbfb8cb2796","Type":"ContainerDied","Data":"bde2e750359771baca5b29f8732f91fba95fd31a3c6b871c540d6b5716edf0f3"} Apr 22 16:53:13.831120 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.830690 2575 scope.go:117] "RemoveContainer" containerID="2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e" Apr 22 16:53:13.840282 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.840267 2575 scope.go:117] "RemoveContainer" containerID="2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e" Apr 22 16:53:13.840547 ip-10-0-142-238 kubenswrapper[2575]: 
E0422 16:53:13.840528 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e\": container with ID starting with 2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e not found: ID does not exist" containerID="2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e" Apr 22 16:53:13.840592 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.840556 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e"} err="failed to get container status \"2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e\": rpc error: code = NotFound desc = could not find container \"2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e\": container with ID starting with 2f5b99e8d9e82d0d7858f7781c2fa87d6c90c0fe64eb15f5653038616694c05e not found: ID does not exist" Apr 22 16:53:13.852291 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.852272 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-xwt5q"] Apr 22 16:53:13.858554 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:13.858519 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-xwt5q"] Apr 22 16:53:14.905530 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:53:14.905496 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" path="/var/lib/kubelet/pods/a869ea81-bd1b-4b61-9b20-8fbfb8cb2796/volumes" Apr 22 16:55:38.742940 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:38.742898 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-58786b46b-7vj5g"] Apr 22 16:55:38.743589 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:38.743139 2575 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kuadrant-system/authorino-58786b46b-7vj5g" podUID="182f6f3f-f045-4acd-ba01-3a4d5b314bb0" containerName="authorino" containerID="cri-o://1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9" gracePeriod=30 Apr 22 16:55:38.987293 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:38.987272 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:55:39.077025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.076940 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-tls-cert\") pod \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " Apr 22 16:55:39.077025 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.076999 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfjj\" (UniqueName: \"kubernetes.io/projected/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-kube-api-access-kcfjj\") pod \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\" (UID: \"182f6f3f-f045-4acd-ba01-3a4d5b314bb0\") " Apr 22 16:55:39.079169 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.079142 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-kube-api-access-kcfjj" (OuterVolumeSpecName: "kube-api-access-kcfjj") pod "182f6f3f-f045-4acd-ba01-3a4d5b314bb0" (UID: "182f6f3f-f045-4acd-ba01-3a4d5b314bb0"). InnerVolumeSpecName "kube-api-access-kcfjj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:55:39.087261 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.087237 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "182f6f3f-f045-4acd-ba01-3a4d5b314bb0" (UID: "182f6f3f-f045-4acd-ba01-3a4d5b314bb0"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:55:39.177668 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.177640 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-tls-cert\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:55:39.177668 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.177665 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kcfjj\" (UniqueName: \"kubernetes.io/projected/182f6f3f-f045-4acd-ba01-3a4d5b314bb0-kube-api-access-kcfjj\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 16:55:39.393474 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.393385 2575 generic.go:358] "Generic (PLEG): container finished" podID="182f6f3f-f045-4acd-ba01-3a4d5b314bb0" containerID="1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9" exitCode=0 Apr 22 16:55:39.393474 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.393432 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-58786b46b-7vj5g" Apr 22 16:55:39.393474 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.393455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-58786b46b-7vj5g" event={"ID":"182f6f3f-f045-4acd-ba01-3a4d5b314bb0","Type":"ContainerDied","Data":"1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9"} Apr 22 16:55:39.393697 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.393480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-58786b46b-7vj5g" event={"ID":"182f6f3f-f045-4acd-ba01-3a4d5b314bb0","Type":"ContainerDied","Data":"34ed51b50254e82101bb765ea9edaed054dd2768816a7b0f2d6d2c738d584197"} Apr 22 16:55:39.393697 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.393496 2575 scope.go:117] "RemoveContainer" containerID="1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9" Apr 22 16:55:39.402901 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.402883 2575 scope.go:117] "RemoveContainer" containerID="1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9" Apr 22 16:55:39.403184 ip-10-0-142-238 kubenswrapper[2575]: E0422 16:55:39.403164 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9\": container with ID starting with 1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9 not found: ID does not exist" containerID="1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9" Apr 22 16:55:39.403251 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.403195 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9"} err="failed to get container status \"1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9\": rpc error: code = 
NotFound desc = could not find container \"1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9\": container with ID starting with 1b68c22047d8f492ad52e81e8f06a7391a82052f9d5522226aa51c7d41769cd9 not found: ID does not exist" Apr 22 16:55:39.415904 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.415879 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-58786b46b-7vj5g"] Apr 22 16:55:39.422252 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:39.422230 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-58786b46b-7vj5g"] Apr 22 16:55:40.905575 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:55:40.905538 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="182f6f3f-f045-4acd-ba01-3a4d5b314bb0" path="/var/lib/kubelet/pods/182f6f3f-f045-4acd-ba01-3a4d5b314bb0/volumes" Apr 22 16:56:40.927379 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:56:40.927269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:56:40.931386 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:56:40.930470 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 16:57:06.021297 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021261 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66c9db867c-v9ngf"] Apr 22 16:57:06.021723 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021637 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" containerName="authorino" Apr 22 16:57:06.021723 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021649 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" 
containerName="authorino" Apr 22 16:57:06.021723 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021660 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="182f6f3f-f045-4acd-ba01-3a4d5b314bb0" containerName="authorino" Apr 22 16:57:06.021723 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021666 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="182f6f3f-f045-4acd-ba01-3a4d5b314bb0" containerName="authorino" Apr 22 16:57:06.021848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021740 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="182f6f3f-f045-4acd-ba01-3a4d5b314bb0" containerName="authorino" Apr 22 16:57:06.021848 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.021747 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a869ea81-bd1b-4b61-9b20-8fbfb8cb2796" containerName="authorino" Apr 22 16:57:06.024830 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.024809 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 16:57:06.028181 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.028159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-p2k84\"" Apr 22 16:57:06.034446 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.034415 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c9db867c-v9ngf"] Apr 22 16:57:06.045977 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.045954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hlhw\" (UniqueName: \"kubernetes.io/projected/fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a-kube-api-access-8hlhw\") pod \"maas-controller-66c9db867c-v9ngf\" (UID: \"fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a\") " pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 16:57:06.146999 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.146967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hlhw\" (UniqueName: \"kubernetes.io/projected/fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a-kube-api-access-8hlhw\") pod \"maas-controller-66c9db867c-v9ngf\" (UID: \"fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a\") " pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 16:57:06.157007 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.156982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hlhw\" (UniqueName: \"kubernetes.io/projected/fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a-kube-api-access-8hlhw\") pod \"maas-controller-66c9db867c-v9ngf\" (UID: \"fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a\") " pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 16:57:06.335777 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.335698 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 16:57:06.461369 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.461343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c9db867c-v9ngf"] Apr 22 16:57:06.462653 ip-10-0-142-238 kubenswrapper[2575]: W0422 16:57:06.462626 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd6ee9cb_5ab0_4dd6_9784_1bf2530b1e5a.slice/crio-ddb796427c10c619a41644ffe8a608ba86b23c17c10a182b9d8eb3a6e711f48e WatchSource:0}: Error finding container ddb796427c10c619a41644ffe8a608ba86b23c17c10a182b9d8eb3a6e711f48e: Status 404 returned error can't find the container with id ddb796427c10c619a41644ffe8a608ba86b23c17c10a182b9d8eb3a6e711f48e Apr 22 16:57:06.463873 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.463857 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:57:06.747218 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:06.747177 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c9db867c-v9ngf" event={"ID":"fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a","Type":"ContainerStarted","Data":"ddb796427c10c619a41644ffe8a608ba86b23c17c10a182b9d8eb3a6e711f48e"} Apr 22 16:57:09.762206 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:09.762170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c9db867c-v9ngf" event={"ID":"fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a","Type":"ContainerStarted","Data":"e5d2d06d09baeac52dd58bd2cce6abe573f8f349116974637087b03a1fc44805"} Apr 22 16:57:09.762631 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:09.762251 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 16:57:09.779922 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:09.779868 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66c9db867c-v9ngf" podStartSLOduration=1.454158405 podStartE2EDuration="3.779854597s" podCreationTimestamp="2026-04-22 16:57:06 +0000 UTC" firstStartedPulling="2026-04-22 16:57:06.463976898 +0000 UTC m=+2126.165994986" lastFinishedPulling="2026-04-22 16:57:08.789673088 +0000 UTC m=+2128.491691178" observedRunningTime="2026-04-22 16:57:09.776845064 +0000 UTC m=+2129.478863173" watchObservedRunningTime="2026-04-22 16:57:09.779854597 +0000 UTC m=+2129.481872705" Apr 22 16:57:20.772026 ip-10-0-142-238 kubenswrapper[2575]: I0422 16:57:20.771994 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66c9db867c-v9ngf" Apr 22 17:00:00.148771 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.148724 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29614620-c8p8d"] Apr 22 17:00:00.152410 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.152386 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:00.155305 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.155271 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bgvcr\"" Apr 22 17:00:00.167786 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.167762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614620-c8p8d"] Apr 22 17:00:00.305364 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.305332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzc7\" (UniqueName: \"kubernetes.io/projected/4e209442-2c25-401c-834a-4f563e4b813b-kube-api-access-cnzc7\") pod \"maas-api-key-cleanup-29614620-c8p8d\" (UID: \"4e209442-2c25-401c-834a-4f563e4b813b\") " pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:00.406452 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.406361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnzc7\" (UniqueName: \"kubernetes.io/projected/4e209442-2c25-401c-834a-4f563e4b813b-kube-api-access-cnzc7\") pod \"maas-api-key-cleanup-29614620-c8p8d\" (UID: \"4e209442-2c25-401c-834a-4f563e4b813b\") " pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:00.415401 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.415373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnzc7\" (UniqueName: \"kubernetes.io/projected/4e209442-2c25-401c-834a-4f563e4b813b-kube-api-access-cnzc7\") pod \"maas-api-key-cleanup-29614620-c8p8d\" (UID: \"4e209442-2c25-401c-834a-4f563e4b813b\") " pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:00.463000 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.462960 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:00.587677 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:00.587650 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614620-c8p8d"] Apr 22 17:00:00.589467 ip-10-0-142-238 kubenswrapper[2575]: W0422 17:00:00.589433 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e209442_2c25_401c_834a_4f563e4b813b.slice/crio-c4fc249fc41f320788f66023d34793bd384a412b4d8040b4cd016bce7a3d98c2 WatchSource:0}: Error finding container c4fc249fc41f320788f66023d34793bd384a412b4d8040b4cd016bce7a3d98c2: Status 404 returned error can't find the container with id c4fc249fc41f320788f66023d34793bd384a412b4d8040b4cd016bce7a3d98c2 Apr 22 17:00:01.440960 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:01.440926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" event={"ID":"4e209442-2c25-401c-834a-4f563e4b813b","Type":"ContainerStarted","Data":"c4fc249fc41f320788f66023d34793bd384a412b4d8040b4cd016bce7a3d98c2"} Apr 22 17:00:03.450488 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:03.450454 2575 generic.go:358] "Generic (PLEG): container finished" podID="4e209442-2c25-401c-834a-4f563e4b813b" containerID="757faedf3b090d26fe59df24904193aade6b7157fb891f5b7d4dfe641af18bd2" exitCode=0 Apr 22 17:00:03.450839 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:03.450496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" event={"ID":"4e209442-2c25-401c-834a-4f563e4b813b","Type":"ContainerDied","Data":"757faedf3b090d26fe59df24904193aade6b7157fb891f5b7d4dfe641af18bd2"} Apr 22 17:00:04.580368 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:04.580342 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:04.647849 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:04.647815 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnzc7\" (UniqueName: \"kubernetes.io/projected/4e209442-2c25-401c-834a-4f563e4b813b-kube-api-access-cnzc7\") pod \"4e209442-2c25-401c-834a-4f563e4b813b\" (UID: \"4e209442-2c25-401c-834a-4f563e4b813b\") " Apr 22 17:00:04.650205 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:04.650173 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e209442-2c25-401c-834a-4f563e4b813b-kube-api-access-cnzc7" (OuterVolumeSpecName: "kube-api-access-cnzc7") pod "4e209442-2c25-401c-834a-4f563e4b813b" (UID: "4e209442-2c25-401c-834a-4f563e4b813b"). InnerVolumeSpecName "kube-api-access-cnzc7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:00:04.749634 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:04.749535 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnzc7\" (UniqueName: \"kubernetes.io/projected/4e209442-2c25-401c-834a-4f563e4b813b-kube-api-access-cnzc7\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 17:00:05.461433 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:05.461397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" event={"ID":"4e209442-2c25-401c-834a-4f563e4b813b","Type":"ContainerDied","Data":"c4fc249fc41f320788f66023d34793bd384a412b4d8040b4cd016bce7a3d98c2"} Apr 22 17:00:05.461433 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:05.461418 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614620-c8p8d" Apr 22 17:00:05.461433 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:00:05.461431 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4fc249fc41f320788f66023d34793bd384a412b4d8040b4cd016bce7a3d98c2" Apr 22 17:01:40.973491 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:01:40.973460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:01:40.977327 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:01:40.977308 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:06:41.007720 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:06:41.007604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:06:41.012258 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:06:41.012241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:07:24.704363 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:24.704318 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv"] Apr 22 17:07:24.704886 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:24.704658 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" podUID="603e9f8f-75a1-4d5b-8d68-44f5577361c5" containerName="manager" containerID="cri-o://22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9" gracePeriod=10 Apr 22 17:07:25.060864 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 17:07:25.060837 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 17:07:25.194608 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.194571 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/603e9f8f-75a1-4d5b-8d68-44f5577361c5-extensions-socket-volume\") pod \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " Apr 22 17:07:25.194608 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.194608 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xh29\" (UniqueName: \"kubernetes.io/projected/603e9f8f-75a1-4d5b-8d68-44f5577361c5-kube-api-access-9xh29\") pod \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\" (UID: \"603e9f8f-75a1-4d5b-8d68-44f5577361c5\") " Apr 22 17:07:25.194956 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.194930 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/603e9f8f-75a1-4d5b-8d68-44f5577361c5-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "603e9f8f-75a1-4d5b-8d68-44f5577361c5" (UID: "603e9f8f-75a1-4d5b-8d68-44f5577361c5"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:07:25.196978 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.196953 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603e9f8f-75a1-4d5b-8d68-44f5577361c5-kube-api-access-9xh29" (OuterVolumeSpecName: "kube-api-access-9xh29") pod "603e9f8f-75a1-4d5b-8d68-44f5577361c5" (UID: "603e9f8f-75a1-4d5b-8d68-44f5577361c5"). InnerVolumeSpecName "kube-api-access-9xh29". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:07:25.219236 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.219201 2575 generic.go:358] "Generic (PLEG): container finished" podID="603e9f8f-75a1-4d5b-8d68-44f5577361c5" containerID="22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9" exitCode=0 Apr 22 17:07:25.219413 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.219261 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" Apr 22 17:07:25.219413 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.219287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" event={"ID":"603e9f8f-75a1-4d5b-8d68-44f5577361c5","Type":"ContainerDied","Data":"22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9"} Apr 22 17:07:25.219413 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.219330 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv" event={"ID":"603e9f8f-75a1-4d5b-8d68-44f5577361c5","Type":"ContainerDied","Data":"8e1f7357ed495cda2879c78544edec30271053a45f0cf1168c6e41838ef41dca"} Apr 22 17:07:25.219413 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.219347 2575 scope.go:117] "RemoveContainer" containerID="22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9" Apr 22 17:07:25.230240 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.230220 2575 scope.go:117] "RemoveContainer" containerID="22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9" Apr 22 17:07:25.230565 ip-10-0-142-238 kubenswrapper[2575]: E0422 17:07:25.230546 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9\": container with ID starting 
with 22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9 not found: ID does not exist" containerID="22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9" Apr 22 17:07:25.230611 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.230576 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9"} err="failed to get container status \"22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9\": rpc error: code = NotFound desc = could not find container \"22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9\": container with ID starting with 22f6f9f395218acddb07bc06e935e57999990e984858e1005bfc99c5b02678e9 not found: ID does not exist" Apr 22 17:07:25.243303 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.243260 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv"] Apr 22 17:07:25.247345 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.247313 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wwdgv"] Apr 22 17:07:25.295721 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.295683 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/603e9f8f-75a1-4d5b-8d68-44f5577361c5-extensions-socket-volume\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 17:07:25.295721 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:25.295715 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xh29\" (UniqueName: \"kubernetes.io/projected/603e9f8f-75a1-4d5b-8d68-44f5577361c5-kube-api-access-9xh29\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 17:07:26.905797 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:07:26.905758 2575 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="603e9f8f-75a1-4d5b-8d68-44f5577361c5" path="/var/lib/kubelet/pods/603e9f8f-75a1-4d5b-8d68-44f5577361c5/volumes" Apr 22 17:08:30.790126 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790092 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9"] Apr 22 17:08:30.790577 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790480 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e209442-2c25-401c-834a-4f563e4b813b" containerName="cleanup" Apr 22 17:08:30.790577 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790492 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e209442-2c25-401c-834a-4f563e4b813b" containerName="cleanup" Apr 22 17:08:30.790577 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790498 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="603e9f8f-75a1-4d5b-8d68-44f5577361c5" containerName="manager" Apr 22 17:08:30.790577 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790504 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="603e9f8f-75a1-4d5b-8d68-44f5577361c5" containerName="manager" Apr 22 17:08:30.790577 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790577 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e209442-2c25-401c-834a-4f563e4b813b" containerName="cleanup" Apr 22 17:08:30.790742 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.790585 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="603e9f8f-75a1-4d5b-8d68-44f5577361c5" containerName="manager" Apr 22 17:08:30.793748 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.793731 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:30.796726 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.796709 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-75bhj\"" Apr 22 17:08:30.803289 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.803262 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9"] Apr 22 17:08:30.981715 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.981673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6znz\" (UniqueName: \"kubernetes.io/projected/b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b-kube-api-access-x6znz\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2r4l9\" (UID: \"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:30.981913 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:30.981727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2r4l9\" (UID: \"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.082559 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.082467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6znz\" (UniqueName: \"kubernetes.io/projected/b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b-kube-api-access-x6znz\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2r4l9\" (UID: \"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.082559 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.082518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2r4l9\" (UID: \"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.082989 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.082966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2r4l9\" (UID: \"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.093670 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.093637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6znz\" (UniqueName: \"kubernetes.io/projected/b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b-kube-api-access-x6znz\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2r4l9\" (UID: \"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.104630 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.104603 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.236638 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.236611 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9"] Apr 22 17:08:31.237990 ip-10-0-142-238 kubenswrapper[2575]: W0422 17:08:31.237960 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4fdc4e3_ca24_40dd_8f3c_0c1094c85c0b.slice/crio-05615d393ec31b9c6f4634b54460373940e216e8bd67bc09d6f480ad62be7923 WatchSource:0}: Error finding container 05615d393ec31b9c6f4634b54460373940e216e8bd67bc09d6f480ad62be7923: Status 404 returned error can't find the container with id 05615d393ec31b9c6f4634b54460373940e216e8bd67bc09d6f480ad62be7923 Apr 22 17:08:31.240349 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.240332 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:08:31.485417 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.485380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" event={"ID":"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b","Type":"ContainerStarted","Data":"2601ed9fe15133ba82c7e123f929e23970ce5eb4961ec9217ac981ca22ac8738"} Apr 22 17:08:31.485417 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.485419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" event={"ID":"b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b","Type":"ContainerStarted","Data":"05615d393ec31b9c6f4634b54460373940e216e8bd67bc09d6f480ad62be7923"} Apr 22 17:08:31.485643 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.485488 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:08:31.504936 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:31.504890 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" podStartSLOduration=1.5048775 podStartE2EDuration="1.5048775s" podCreationTimestamp="2026-04-22 17:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:08:31.501168995 +0000 UTC m=+2811.203187105" watchObservedRunningTime="2026-04-22 17:08:31.5048775 +0000 UTC m=+2811.206895681" Apr 22 17:08:42.491863 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:08:42.491825 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2r4l9" Apr 22 17:11:41.041195 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:11:41.041093 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:11:41.046845 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:11:41.046826 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:15:00.160470 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.160432 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29614635-c6w89"] Apr 22 17:15:00.163955 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.163936 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:00.168449 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.168429 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bgvcr\"" Apr 22 17:15:00.190771 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.190745 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614635-c6w89"] Apr 22 17:15:00.250223 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.250183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skttr\" (UniqueName: \"kubernetes.io/projected/83e54559-d970-4b1f-b0b4-b6a2d4b314cd-kube-api-access-skttr\") pod \"maas-api-key-cleanup-29614635-c6w89\" (UID: \"83e54559-d970-4b1f-b0b4-b6a2d4b314cd\") " pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:00.351523 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.351484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skttr\" (UniqueName: \"kubernetes.io/projected/83e54559-d970-4b1f-b0b4-b6a2d4b314cd-kube-api-access-skttr\") pod \"maas-api-key-cleanup-29614635-c6w89\" (UID: \"83e54559-d970-4b1f-b0b4-b6a2d4b314cd\") " pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:00.359694 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.359666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skttr\" (UniqueName: \"kubernetes.io/projected/83e54559-d970-4b1f-b0b4-b6a2d4b314cd-kube-api-access-skttr\") pod \"maas-api-key-cleanup-29614635-c6w89\" (UID: \"83e54559-d970-4b1f-b0b4-b6a2d4b314cd\") " pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:00.474031 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.473992 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:00.598203 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.598181 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614635-c6w89"] Apr 22 17:15:00.600355 ip-10-0-142-238 kubenswrapper[2575]: W0422 17:15:00.600327 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e54559_d970_4b1f_b0b4_b6a2d4b314cd.slice/crio-f65c18dba67990932c05ac889eb3438913aede2a27901f670bc540cfbb57410f WatchSource:0}: Error finding container f65c18dba67990932c05ac889eb3438913aede2a27901f670bc540cfbb57410f: Status 404 returned error can't find the container with id f65c18dba67990932c05ac889eb3438913aede2a27901f670bc540cfbb57410f Apr 22 17:15:00.602090 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:00.602074 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:15:01.044361 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:01.044273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" event={"ID":"83e54559-d970-4b1f-b0b4-b6a2d4b314cd","Type":"ContainerStarted","Data":"ce13d82a05d7dba35110ef5ac64f9cc26ae8d9f1f6a5e80e3706d686680ac26e"} Apr 22 17:15:01.044361 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:01.044315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" event={"ID":"83e54559-d970-4b1f-b0b4-b6a2d4b314cd","Type":"ContainerStarted","Data":"f65c18dba67990932c05ac889eb3438913aede2a27901f670bc540cfbb57410f"} Apr 22 17:15:01.061542 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:01.061500 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" podStartSLOduration=1.061485966 podStartE2EDuration="1.061485966s" 
podCreationTimestamp="2026-04-22 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:15:01.060252456 +0000 UTC m=+3200.762270567" watchObservedRunningTime="2026-04-22 17:15:01.061485966 +0000 UTC m=+3200.763504074" Apr 22 17:15:02.049302 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:02.049271 2575 generic.go:358] "Generic (PLEG): container finished" podID="83e54559-d970-4b1f-b0b4-b6a2d4b314cd" containerID="ce13d82a05d7dba35110ef5ac64f9cc26ae8d9f1f6a5e80e3706d686680ac26e" exitCode=0 Apr 22 17:15:02.049684 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:02.049326 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" event={"ID":"83e54559-d970-4b1f-b0b4-b6a2d4b314cd","Type":"ContainerDied","Data":"ce13d82a05d7dba35110ef5ac64f9cc26ae8d9f1f6a5e80e3706d686680ac26e"} Apr 22 17:15:03.187375 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:03.187338 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:03.279819 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:03.279785 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skttr\" (UniqueName: \"kubernetes.io/projected/83e54559-d970-4b1f-b0b4-b6a2d4b314cd-kube-api-access-skttr\") pod \"83e54559-d970-4b1f-b0b4-b6a2d4b314cd\" (UID: \"83e54559-d970-4b1f-b0b4-b6a2d4b314cd\") " Apr 22 17:15:03.281893 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:03.281870 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e54559-d970-4b1f-b0b4-b6a2d4b314cd-kube-api-access-skttr" (OuterVolumeSpecName: "kube-api-access-skttr") pod "83e54559-d970-4b1f-b0b4-b6a2d4b314cd" (UID: "83e54559-d970-4b1f-b0b4-b6a2d4b314cd"). InnerVolumeSpecName "kube-api-access-skttr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:15:03.380877 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:03.380802 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skttr\" (UniqueName: \"kubernetes.io/projected/83e54559-d970-4b1f-b0b4-b6a2d4b314cd-kube-api-access-skttr\") on node \"ip-10-0-142-238.ec2.internal\" DevicePath \"\"" Apr 22 17:15:04.058612 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:04.058571 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" event={"ID":"83e54559-d970-4b1f-b0b4-b6a2d4b314cd","Type":"ContainerDied","Data":"f65c18dba67990932c05ac889eb3438913aede2a27901f670bc540cfbb57410f"} Apr 22 17:15:04.058612 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:04.058604 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614635-c6w89" Apr 22 17:15:04.058612 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:15:04.058611 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f65c18dba67990932c05ac889eb3438913aede2a27901f670bc540cfbb57410f" Apr 22 17:16:41.083656 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:16:41.083556 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:16:41.090493 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:16:41.090471 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log" Apr 22 17:18:08.643817 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:08.643777 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29614620-c8p8d_4e209442-2c25-401c-834a-4f563e4b813b/cleanup/0.log" Apr 22 17:18:08.748772 ip-10-0-142-238 kubenswrapper[2575]: I0422 
17:18:08.748743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29614635-c6w89_83e54559-d970-4b1f-b0b4-b6a2d4b314cd/cleanup/0.log" Apr 22 17:18:08.856524 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:08.856486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-66c9db867c-v9ngf_fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a/manager/0.log" Apr 22 17:18:09.075762 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:09.075712 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57c8d5d679-b8pdc_c3435da7-2b7c-47d4-b6f4-dc09140dd90e/manager/0.log" Apr 22 17:18:10.135671 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.135636 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x_18d8b17a-1897-4266-9555-e6c7d2803a15/util/0.log" Apr 22 17:18:10.141772 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.141749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x_18d8b17a-1897-4266-9555-e6c7d2803a15/pull/0.log" Apr 22 17:18:10.147169 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.147148 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x_18d8b17a-1897-4266-9555-e6c7d2803a15/extract/0.log" Apr 22 17:18:10.249565 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.249538 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k_9a6b497b-a168-4b3d-b259-07fc47a07416/util/0.log" Apr 22 17:18:10.255312 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.255295 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k_9a6b497b-a168-4b3d-b259-07fc47a07416/pull/0.log" Apr 22 17:18:10.260215 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.260198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k_9a6b497b-a168-4b3d-b259-07fc47a07416/extract/0.log" Apr 22 17:18:10.364304 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.364276 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v_590bf948-b178-4415-b16b-a6339ab50c2c/util/0.log" Apr 22 17:18:10.369707 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.369685 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v_590bf948-b178-4415-b16b-a6339ab50c2c/pull/0.log" Apr 22 17:18:10.376149 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.376133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v_590bf948-b178-4415-b16b-a6339ab50c2c/extract/0.log" Apr 22 17:18:10.478796 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.478778 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr_8c5454eb-b854-441d-a6c7-44481e739f60/util/0.log" Apr 22 17:18:10.485610 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.485580 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr_8c5454eb-b854-441d-a6c7-44481e739f60/pull/0.log" Apr 22 17:18:10.491810 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:10.491789 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr_8c5454eb-b854-441d-a6c7-44481e739f60/extract/0.log" Apr 22 17:18:11.036287 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:11.036247 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-7fpqw_e827b06d-94a9-4295-a945-985a14cbb42a/registry-server/0.log" Apr 22 17:18:11.152727 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:11.152702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-2r4l9_b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b/manager/0.log" Apr 22 17:18:11.374386 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:11.374274 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-xjmf8_ab8ddab0-f0c5-4474-8120-fa85e91529c2/manager/0.log" Apr 22 17:18:11.705355 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:11.705325 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fzjdfn_4c0a40a6-c1b0-4337-a99a-e73afe30deb9/istio-proxy/0.log" Apr 22 17:18:12.145535 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:12.145462 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-zxbs5_ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb/istio-proxy/0.log" Apr 22 17:18:12.259100 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:12.259072 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6db85cc586-wptmg_de8899ee-ffb9-447d-bfe6-3f4560af10d3/router/0.log" Apr 22 17:18:16.877443 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.877406 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqwg2/must-gather-w6sj2"] Apr 22 17:18:16.877802 ip-10-0-142-238 kubenswrapper[2575]: 
I0422 17:18:16.877792 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83e54559-d970-4b1f-b0b4-b6a2d4b314cd" containerName="cleanup" Apr 22 17:18:16.877844 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.877804 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e54559-d970-4b1f-b0b4-b6a2d4b314cd" containerName="cleanup" Apr 22 17:18:16.877878 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.877873 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="83e54559-d970-4b1f-b0b4-b6a2d4b314cd" containerName="cleanup" Apr 22 17:18:16.881175 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.881157 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:16.883713 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.883689 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lqwg2\"/\"openshift-service-ca.crt\"" Apr 22 17:18:16.883829 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.883734 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lqwg2\"/\"default-dockercfg-w4hxb\"" Apr 22 17:18:16.884857 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.884839 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lqwg2\"/\"kube-root-ca.crt\"" Apr 22 17:18:16.899769 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:16.899747 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/must-gather-w6sj2"] Apr 22 17:18:17.015582 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.015544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f615ac5d-77e5-4e45-bcde-b88b51ec288d-must-gather-output\") pod \"must-gather-w6sj2\" (UID: 
\"f615ac5d-77e5-4e45-bcde-b88b51ec288d\") " pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.015730 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.015613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftf7\" (UniqueName: \"kubernetes.io/projected/f615ac5d-77e5-4e45-bcde-b88b51ec288d-kube-api-access-4ftf7\") pod \"must-gather-w6sj2\" (UID: \"f615ac5d-77e5-4e45-bcde-b88b51ec288d\") " pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.117218 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.117182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f615ac5d-77e5-4e45-bcde-b88b51ec288d-must-gather-output\") pod \"must-gather-w6sj2\" (UID: \"f615ac5d-77e5-4e45-bcde-b88b51ec288d\") " pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.117409 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.117248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ftf7\" (UniqueName: \"kubernetes.io/projected/f615ac5d-77e5-4e45-bcde-b88b51ec288d-kube-api-access-4ftf7\") pod \"must-gather-w6sj2\" (UID: \"f615ac5d-77e5-4e45-bcde-b88b51ec288d\") " pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.117594 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.117567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f615ac5d-77e5-4e45-bcde-b88b51ec288d-must-gather-output\") pod \"must-gather-w6sj2\" (UID: \"f615ac5d-77e5-4e45-bcde-b88b51ec288d\") " pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.127219 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.127184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ftf7\" (UniqueName: 
\"kubernetes.io/projected/f615ac5d-77e5-4e45-bcde-b88b51ec288d-kube-api-access-4ftf7\") pod \"must-gather-w6sj2\" (UID: \"f615ac5d-77e5-4e45-bcde-b88b51ec288d\") " pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.190987 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.190955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqwg2/must-gather-w6sj2" Apr 22 17:18:17.543885 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.543860 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/must-gather-w6sj2"] Apr 22 17:18:17.545994 ip-10-0-142-238 kubenswrapper[2575]: W0422 17:18:17.545967 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf615ac5d_77e5_4e45_bcde_b88b51ec288d.slice/crio-83fed06da1e0b8f1ff3bc3769b032b8f7129bf03453b3f0f6e8665eedeef16f0 WatchSource:0}: Error finding container 83fed06da1e0b8f1ff3bc3769b032b8f7129bf03453b3f0f6e8665eedeef16f0: Status 404 returned error can't find the container with id 83fed06da1e0b8f1ff3bc3769b032b8f7129bf03453b3f0f6e8665eedeef16f0 Apr 22 17:18:17.854473 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:17.854391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/must-gather-w6sj2" event={"ID":"f615ac5d-77e5-4e45-bcde-b88b51ec288d","Type":"ContainerStarted","Data":"83fed06da1e0b8f1ff3bc3769b032b8f7129bf03453b3f0f6e8665eedeef16f0"} Apr 22 17:18:18.860581 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:18.860538 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/must-gather-w6sj2" event={"ID":"f615ac5d-77e5-4e45-bcde-b88b51ec288d","Type":"ContainerStarted","Data":"9337062fc6f9d19a05c076b4ead31f08ad25a4a249d7945181aa308876e133d7"} Apr 22 17:18:18.860581 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:18.860581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-lqwg2/must-gather-w6sj2" event={"ID":"f615ac5d-77e5-4e45-bcde-b88b51ec288d","Type":"ContainerStarted","Data":"596324d359acbec759d4d986377ea7f192a4acecabfa7e0a228e212c46d2c385"} Apr 22 17:18:18.876055 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:18.875988 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqwg2/must-gather-w6sj2" podStartSLOduration=1.8989171059999999 podStartE2EDuration="2.875969987s" podCreationTimestamp="2026-04-22 17:18:16 +0000 UTC" firstStartedPulling="2026-04-22 17:18:17.54794044 +0000 UTC m=+3397.249958531" lastFinishedPulling="2026-04-22 17:18:18.524993321 +0000 UTC m=+3398.227011412" observedRunningTime="2026-04-22 17:18:18.875180333 +0000 UTC m=+3398.577198440" watchObservedRunningTime="2026-04-22 17:18:18.875969987 +0000 UTC m=+3398.577988096" Apr 22 17:18:20.073169 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:20.073137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mxnhl_2a362d49-91bf-4ec1-b686-5a8676288536/global-pull-secret-syncer/0.log" Apr 22 17:18:20.169806 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:20.169778 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4c4gx_df5633bf-88a8-430f-b582-1e8a7a03005c/konnectivity-agent/0.log" Apr 22 17:18:20.313557 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:20.313519 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-238.ec2.internal_cce540b655be06a7bb42ab4ffff03c36/haproxy/0.log" Apr 22 17:18:23.897731 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:23.897690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x_18d8b17a-1897-4266-9555-e6c7d2803a15/extract/0.log" Apr 22 17:18:23.918835 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:23.918802 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x_18d8b17a-1897-4266-9555-e6c7d2803a15/util/0.log" Apr 22 17:18:23.947794 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:23.947767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595qp4x_18d8b17a-1897-4266-9555-e6c7d2803a15/pull/0.log" Apr 22 17:18:23.984700 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:23.984652 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k_9a6b497b-a168-4b3d-b259-07fc47a07416/extract/0.log" Apr 22 17:18:24.011961 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.011923 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k_9a6b497b-a168-4b3d-b259-07fc47a07416/util/0.log" Apr 22 17:18:24.041836 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.041806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qxk8k_9a6b497b-a168-4b3d-b259-07fc47a07416/pull/0.log" Apr 22 17:18:24.074594 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.074562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v_590bf948-b178-4415-b16b-a6339ab50c2c/extract/0.log" Apr 22 17:18:24.103270 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.103236 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v_590bf948-b178-4415-b16b-a6339ab50c2c/util/0.log" Apr 22 17:18:24.133388 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.133352 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7356x6v_590bf948-b178-4415-b16b-a6339ab50c2c/pull/0.log" Apr 22 17:18:24.160920 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.160841 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr_8c5454eb-b854-441d-a6c7-44481e739f60/extract/0.log" Apr 22 17:18:24.190705 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.190671 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr_8c5454eb-b854-441d-a6c7-44481e739f60/util/0.log" Apr 22 17:18:24.221507 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.221476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1fcdxr_8c5454eb-b854-441d-a6c7-44481e739f60/pull/0.log" Apr 22 17:18:24.535341 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.535306 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-7fpqw_e827b06d-94a9-4295-a945-985a14cbb42a/registry-server/0.log" Apr 22 17:18:24.618877 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.618841 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-2r4l9_b4fdc4e3-ca24-40dd-8f3c-0c1094c85c0b/manager/0.log" Apr 22 17:18:24.721290 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:24.721258 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-xjmf8_ab8ddab0-f0c5-4474-8120-fa85e91529c2/manager/0.log" Apr 22 17:18:26.110935 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.110896 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/alertmanager/0.log"
Apr 22 17:18:26.141188 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.141154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/config-reloader/0.log"
Apr 22 17:18:26.166064 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.165790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/kube-rbac-proxy-web/0.log"
Apr 22 17:18:26.202601 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.202570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/kube-rbac-proxy/0.log"
Apr 22 17:18:26.230027 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.229985 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/kube-rbac-proxy-metric/0.log"
Apr 22 17:18:26.253960 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.253934 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/prom-label-proxy/0.log"
Apr 22 17:18:26.277375 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.277349 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_02918e16-1e6c-44cd-9fc0-2e3caac620b6/init-config-reloader/0.log"
Apr 22 17:18:26.355234 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.355118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9mtcp_42540c61-86c6-4b6b-9abc-a3e2fd297a0d/kube-state-metrics/0.log"
Apr 22 17:18:26.376059 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.376013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9mtcp_42540c61-86c6-4b6b-9abc-a3e2fd297a0d/kube-rbac-proxy-main/0.log"
Apr 22 17:18:26.410608 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.410025 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9mtcp_42540c61-86c6-4b6b-9abc-a3e2fd297a0d/kube-rbac-proxy-self/0.log"
Apr 22 17:18:26.680653 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.680569 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zcsck_2d29fc65-9864-4e36-bfb7-97c820068952/node-exporter/0.log"
Apr 22 17:18:26.707317 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.707289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zcsck_2d29fc65-9864-4e36-bfb7-97c820068952/kube-rbac-proxy/0.log"
Apr 22 17:18:26.737442 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:26.737409 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zcsck_2d29fc65-9864-4e36-bfb7-97c820068952/init-textfile/0.log"
Apr 22 17:18:27.080537 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.080499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-z6p99_dd347c8d-dbe5-4f7e-90f0-0fb9180c29c4/prometheus-operator-admission-webhook/0.log"
Apr 22 17:18:27.109980 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.109954 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54f4944c8f-wns2w_69e0156f-0f0d-42db-8022-313f141d4dc1/telemeter-client/0.log"
Apr 22 17:18:27.131247 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.131218 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54f4944c8f-wns2w_69e0156f-0f0d-42db-8022-313f141d4dc1/reload/0.log"
Apr 22 17:18:27.151865 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.151811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54f4944c8f-wns2w_69e0156f-0f0d-42db-8022-313f141d4dc1/kube-rbac-proxy/0.log"
Apr 22 17:18:27.180825 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.180789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78c87765bc-6xw5b_a81b8fe4-04a8-4d2e-8da1-c17517d508db/thanos-query/0.log"
Apr 22 17:18:27.201059 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.201015 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78c87765bc-6xw5b_a81b8fe4-04a8-4d2e-8da1-c17517d508db/kube-rbac-proxy-web/0.log"
Apr 22 17:18:27.232698 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.232667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78c87765bc-6xw5b_a81b8fe4-04a8-4d2e-8da1-c17517d508db/kube-rbac-proxy/0.log"
Apr 22 17:18:27.249687 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.249658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78c87765bc-6xw5b_a81b8fe4-04a8-4d2e-8da1-c17517d508db/prom-label-proxy/0.log"
Apr 22 17:18:27.269582 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.269558 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78c87765bc-6xw5b_a81b8fe4-04a8-4d2e-8da1-c17517d508db/kube-rbac-proxy-rules/0.log"
Apr 22 17:18:27.288922 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:27.288897 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78c87765bc-6xw5b_a81b8fe4-04a8-4d2e-8da1-c17517d508db/kube-rbac-proxy-metrics/0.log"
Apr 22 17:18:29.071208 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.071176 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"]
Apr 22 17:18:29.078141 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.078113 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.084579 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.084547 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"]
Apr 22 17:18:29.249426 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.249390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-lib-modules\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.249624 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.249438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x52sl\" (UniqueName: \"kubernetes.io/projected/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-kube-api-access-x52sl\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.249624 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.249469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-sys\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.249624 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.249499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-proc\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.249768 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.249657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-podres\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351079 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.350956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-podres\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351079 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-lib-modules\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351306 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x52sl\" (UniqueName: \"kubernetes.io/projected/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-kube-api-access-x52sl\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351306 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-sys\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351306 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-podres\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351306 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-proc\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351306 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-proc\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351552 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-sys\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.351552 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.351432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-lib-modules\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.357699 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.357670 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-884fb454c-25glw_d2ec5175-61f0-4b9a-b8ae-64af4003cbf9/console/0.log"
Apr 22 17:18:29.360330 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.360302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x52sl\" (UniqueName: \"kubernetes.io/projected/d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927-kube-api-access-x52sl\") pod \"perf-node-gather-daemonset-48tdc\" (UID: \"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.389105 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.389071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.555766 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.555729 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"]
Apr 22 17:18:29.558999 ip-10-0-142-238 kubenswrapper[2575]: W0422 17:18:29.558970 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd1bd2cd2_b1d0_4081_ad4a_560d7f6b6927.slice/crio-56ff18e37c6d181bddbc6d89f7ca89373203eb3dfa3cf8bed8d1295b06e0d0be WatchSource:0}: Error finding container 56ff18e37c6d181bddbc6d89f7ca89373203eb3dfa3cf8bed8d1295b06e0d0be: Status 404 returned error can't find the container with id 56ff18e37c6d181bddbc6d89f7ca89373203eb3dfa3cf8bed8d1295b06e0d0be
Apr 22 17:18:29.924229 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.924130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc" event={"ID":"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927","Type":"ContainerStarted","Data":"665af1a988add8b09d7cd744dcc204497359cdb847a1d9454c4ec437b613a425"}
Apr 22 17:18:29.924229 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.924178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc" event={"ID":"d1bd2cd2-b1d0-4081-ad4a-560d7f6b6927","Type":"ContainerStarted","Data":"56ff18e37c6d181bddbc6d89f7ca89373203eb3dfa3cf8bed8d1295b06e0d0be"}
Apr 22 17:18:29.924465 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.924247 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:29.928721 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.928679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-fqmt7_7f802959-aff2-42a4-8382-362ea3582d6a/volume-data-source-validator/0.log"
Apr 22 17:18:29.940434 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:29.940377 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc" podStartSLOduration=0.940359635 podStartE2EDuration="940.359635ms" podCreationTimestamp="2026-04-22 17:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:18:29.937685895 +0000 UTC m=+3409.639704004" watchObservedRunningTime="2026-04-22 17:18:29.940359635 +0000 UTC m=+3409.642377744"
Apr 22 17:18:30.938832 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:30.938801 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8zkfn_e3238d38-c8d6-423c-bfa5-3feb9c21e8bc/dns/0.log"
Apr 22 17:18:30.961319 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:30.961295 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8zkfn_e3238d38-c8d6-423c-bfa5-3feb9c21e8bc/kube-rbac-proxy/0.log"
Apr 22 17:18:31.006534 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:31.006505 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jg6wx_91576000-c254-43f5-84ba-7029c347da22/dns-node-resolver/0.log"
Apr 22 17:18:31.500805 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:31.500758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-659c89dd9c-7zqf4_ada8f77b-6c93-4914-a78d-b753c44deb3e/registry/0.log"
Apr 22 17:18:31.524188 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:31.524140 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xkxjl_a929b972-02c0-4e8a-b302-09406b1c441c/node-ca/0.log"
Apr 22 17:18:32.349209 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:32.349174 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fzjdfn_4c0a40a6-c1b0-4337-a99a-e73afe30deb9/istio-proxy/0.log"
Apr 22 17:18:32.626193 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:32.626113 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-zxbs5_ddc3ac5b-9a8f-444e-8119-aca65c5a6fcb/istio-proxy/0.log"
Apr 22 17:18:32.648854 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:32.648822 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6db85cc586-wptmg_de8899ee-ffb9-447d-bfe6-3f4560af10d3/router/0.log"
Apr 22 17:18:33.173187 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:33.173161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r9mm9_3ccac1e5-a013-4728-8544-cd8df005a479/serve-healthcheck-canary/0.log"
Apr 22 17:18:33.624187 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:33.624143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hrwl2_47b372d5-7432-499c-b5f2-baff8c5f3689/insights-operator/0.log"
Apr 22 17:18:33.625771 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:33.625746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hrwl2_47b372d5-7432-499c-b5f2-baff8c5f3689/insights-operator/1.log"
Apr 22 17:18:33.649463 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:33.649420 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-grrr6_af39f682-9ae9-4abb-ad48-ca3370263b6b/kube-rbac-proxy/0.log"
Apr 22 17:18:33.672898 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:33.672868 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-grrr6_af39f682-9ae9-4abb-ad48-ca3370263b6b/exporter/0.log"
Apr 22 17:18:33.693690 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:33.693665 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-grrr6_af39f682-9ae9-4abb-ad48-ca3370263b6b/extractor/0.log"
Apr 22 17:18:35.858730 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:35.858699 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29614620-c8p8d_4e209442-2c25-401c-834a-4f563e4b813b/cleanup/0.log"
Apr 22 17:18:35.881800 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:35.881773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29614635-c6w89_83e54559-d970-4b1f-b0b4-b6a2d4b314cd/cleanup/0.log"
Apr 22 17:18:35.941471 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:35.941436 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-48tdc"
Apr 22 17:18:35.953091 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:35.953066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-66c9db867c-v9ngf_fd6ee9cb-5ab0-4dd6-9784-1bf2530b1e5a/manager/0.log"
Apr 22 17:18:36.014657 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:36.014624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57c8d5d679-b8pdc_c3435da7-2b7c-47d4-b6f4-dc09140dd90e/manager/0.log"
Apr 22 17:18:37.183641 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:37.183610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7c5749599b-gjqlf_b489fa58-5f6c-4bbf-ba63-a73c1f64f28e/manager/0.log"
Apr 22 17:18:41.899171 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:41.899138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jjjg5_b81523a0-77f7-4e1d-9f17-d99fd060b090/kube-storage-version-migrator-operator/1.log"
Apr 22 17:18:41.899817 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:41.899797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jjjg5_b81523a0-77f7-4e1d-9f17-d99fd060b090/kube-storage-version-migrator-operator/0.log"
Apr 22 17:18:42.839949 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.839916 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/kube-multus-additional-cni-plugins/0.log"
Apr 22 17:18:42.860499 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.860464 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/egress-router-binary-copy/0.log"
Apr 22 17:18:42.880314 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.880280 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/cni-plugins/0.log"
Apr 22 17:18:42.899550 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.899526 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/bond-cni-plugin/0.log"
Apr 22 17:18:42.920093 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.920071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/routeoverride-cni/0.log"
Apr 22 17:18:42.940265 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.940244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/whereabouts-cni-bincopy/0.log"
Apr 22 17:18:42.964853 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:42.964828 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4gkbb_020a35c8-d40b-477c-8c6e-1530096b3f1a/whereabouts-cni/0.log"
Apr 22 17:18:43.298998 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:43.298970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d5sxk_b508f921-8bf7-4ed5-858a-04f8cf475055/kube-multus/0.log"
Apr 22 17:18:43.373385 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:43.373361 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5wqw7_09f37d35-30d1-4fc0-a88f-3514e6c16586/network-metrics-daemon/0.log"
Apr 22 17:18:43.392170 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:43.392142 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5wqw7_09f37d35-30d1-4fc0-a88f-3514e6c16586/kube-rbac-proxy/0.log"
Apr 22 17:18:44.864633 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.864602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-controller/0.log"
Apr 22 17:18:44.883955 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.883925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/0.log"
Apr 22 17:18:44.899210 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.899188 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovn-acl-logging/1.log"
Apr 22 17:18:44.920578 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.920554 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/kube-rbac-proxy-node/0.log"
Apr 22 17:18:44.951203 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.951178 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 17:18:44.972914 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.972889 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/northd/0.log"
Apr 22 17:18:44.996190 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:44.996167 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/nbdb/0.log"
Apr 22 17:18:45.018408 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:45.018382 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/sbdb/0.log"
Apr 22 17:18:45.128147 ip-10-0-142-238 kubenswrapper[2575]: I0422 17:18:45.128078 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pxgm2_bbe5211a-cea9-4848-93bc-eaa0e38f906d/ovnkube-controller/0.log"