Apr 16 20:25:02.051806 ip-10-0-132-101 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:25:02.051817 ip-10-0-132-101 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:25:02.051825 ip-10-0-132-101 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:25:02.052131 ip-10-0-132-101 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:25:12.142292 ip-10-0-132-101 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:25:12.142308 ip-10-0-132-101 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e6aa102acfdc4b23a0fe5cbe447b52c0 --
Apr 16 20:27:32.609766 ip-10-0-132-101 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:27:33.052357 ip-10-0-132-101 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:33.052357 ip-10-0-132-101 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:27:33.052357 ip-10-0-132-101 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:33.052357 ip-10-0-132-101 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:27:33.052357 ip-10-0-132-101 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:27:33.053155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.053069 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:27:33.057283 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057268 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:33.057283 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057284 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057289 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057292 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057296 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057299 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057302 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057305 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057307 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057310 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057313 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057316 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057318 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057321 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057323 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057326 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057329 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057333 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057343 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057347 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:33.057350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057350 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057353 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057356 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057359 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057362 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057365 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057367 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057370 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057373 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057375 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057378 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057380 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057383 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057386 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057388 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057391 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057393 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057396 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057398 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057401 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:33.057811 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057404 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057406 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057410 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057413 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057415 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057418 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057420 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057423 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057425 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057427 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057430 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057432 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057435 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057437 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057441 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057444 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057447 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057449 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057452 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057455 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:33.058305 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057457 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057474 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057477 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057480 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057483 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057485 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057488 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057490 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057493 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057496 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057499 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057501 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057504 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057507 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057509 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057512 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057514 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057518 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057521 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057530 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:33.058804 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057533 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057537 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057541 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057543 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057546 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057549 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057909 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057914 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057917 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057920 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057923 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057926 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057929 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057931 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057934 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057937 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057939 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057942 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057945 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:33.059299 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057947 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057950 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057952 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057955 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057957 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057960 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057962 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057965 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057970 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057973 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057976 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057980 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057983 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057987 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057990 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057993 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057995 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.057998 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058001 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:33.059762 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058003 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058006 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058009 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058011 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058013 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058016 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058018 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058021 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058023 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058026 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058028 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058030 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058033 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058035 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058038 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058040 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058043 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058045 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058048 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:33.060224 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058050 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058053 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058055 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058058 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058061 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058064 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058068 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058070 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058073 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058075 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058078 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058081 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058085 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058087 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058090 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058093 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058096 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058098 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058101 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058103 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:33.060713 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058106 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058109 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058111 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058115 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058117 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058120 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058122 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058125 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058127 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058130 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058132 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058135 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058137 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058139 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.058142 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059564 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059583 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059593 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059598 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059602 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059606 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:27:33.061197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059611 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059615 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059619 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059622 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059626 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059629 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059632 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059635 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059638 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059641 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059644 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059647 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059651 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059655 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059658 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059662 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059665 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059668 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059672 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059676 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059679 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059683 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059686 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059689 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:27:33.061709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059692 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059696 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059699 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059704 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059709 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059712 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059715 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059718 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059721 2577 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059725 2577 flags.go:64] FLAG: --event-burst="100" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059728 2577 flags.go:64] FLAG: --event-qps="50" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059731 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059734 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059737 2577 flags.go:64] FLAG: --eviction-hard="" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059742 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059744 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059748 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059751 2577 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059754 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059757 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059760 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059763 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059766 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:27:33.062305 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:33.059769 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059772 2577 flags.go:64] FLAG: --feature-gates="" Apr 16 20:27:33.062305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059776 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059779 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059782 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059786 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059789 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059792 2577 flags.go:64] FLAG: --help="false" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059795 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059798 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059801 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059804 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059808 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059811 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059815 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059818 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059821 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059824 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059827 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059830 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059833 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059836 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059838 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059841 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059844 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059847 2577 flags.go:64] FLAG: --lock-file="" Apr 16 20:27:33.062926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059850 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059853 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059856 2577 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059861 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059864 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059867 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059870 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059873 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059876 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059879 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059882 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059887 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059890 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059894 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059897 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059900 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059903 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 
20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059906 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059909 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059912 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059915 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059924 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059927 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059930 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:27:33.063520 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059933 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059936 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059942 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059945 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059948 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059951 2577 flags.go:64] FLAG: --port="10250" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059954 2577 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059956 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02022c52124fbeec4" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059960 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059963 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059966 2577 flags.go:64] FLAG: --register-node="true" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059969 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059971 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059975 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059977 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059980 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059984 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059987 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059991 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059994 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059996 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.059999 2577 
flags.go:64] FLAG: --runonce="false" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060002 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060005 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060009 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:27:33.064127 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060012 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060015 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060018 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060021 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060025 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060028 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060031 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060034 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060037 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060040 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060043 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 
20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060046 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060051 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060054 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060057 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060061 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060064 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060067 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060070 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060073 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060076 2577 flags.go:64] FLAG: --v="2" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060080 2577 flags.go:64] FLAG: --version="false" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060084 2577 flags.go:64] FLAG: --vmodule="" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060089 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.060092 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:27:33.064745 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060186 2577 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060190 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060193 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060196 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060199 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060201 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060204 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060206 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060210 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060213 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060215 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060218 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060220 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 
16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060223 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060226 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060229 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060232 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060234 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060237 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:27:33.065330 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060240 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060243 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060245 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060247 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060250 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060253 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060256 2577 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060258 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060260 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060263 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060265 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060268 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060270 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060273 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060276 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060279 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060281 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060284 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060286 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: 
W0416 20:27:33.060288 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:27:33.065794 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060291 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060294 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060296 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060299 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060301 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060304 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060306 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060309 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060311 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060315 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060319 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060322 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060325 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060327 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060330 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060332 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060335 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060338 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060341 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060343 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:27:33.066308 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060345 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060348 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060350 2577 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060353 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060355 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060362 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060365 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060369 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060372 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060375 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060378 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060380 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060383 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060385 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060388 2577 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall
Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060390 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060393 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060395 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060398 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:33.067064 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060401 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060403 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060406 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060409 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060411 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060414 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060416 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.060419 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:33.067923 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.061131 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:27:33.069221 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.069200 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:27:33.069306 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.069223 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:27:33.069306 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069289 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:33.069306 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069296 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:33.069306 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069301 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:33.069306 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069305 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069310 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069317 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069322 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069326 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069330 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069335 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069339 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069343 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069347 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069351 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069355 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069359 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069364 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069368 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069373 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069378 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069382 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069387 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:33.069534 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069391 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069395 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069399 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069403 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069408 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069412 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069417 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069421 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069425 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069441 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069446 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069453 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069478 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069484 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069489 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069495 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069500 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069504 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069509 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069514 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:33.070368 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069518 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069522 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069526 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069531 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069535 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069539 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069543 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069547 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069551 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069556 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069559 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069564 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069568 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069572 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069576 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069580 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069584 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069588 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069593 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069597 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:33.071018 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069601 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069605 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069610 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069614 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069619 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069623 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069627 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069632 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069637 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069641 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069645 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069649 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069653 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069658 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069662 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069666 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069670 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069675 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069679 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:33.071615 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069683 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069687 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069692 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069695 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069699 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.069708 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069865 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069872 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069877 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069881 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069885 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069889 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069894 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069898 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069901 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:27:33.072403 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069906 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069911 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069915 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069919 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069923 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069928 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069933 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069937 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069941 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069944 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069949 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069953 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069957 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069961 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069966 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069970 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069974 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069978 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069982 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069986 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:27:33.073056 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069990 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069994 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.069999 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070003 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070007 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070010 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070015 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070019 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070023 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070027 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070031 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070035 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070040 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070044 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070049 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070053 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070057 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070061 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070066 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070070 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:27:33.073602 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070074 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070078 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070082 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070087 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070091 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070095 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070099 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070103 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070107 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070111 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070115 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070121 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070128 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070132 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070137 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070142 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070146 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070150 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070155 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070160 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:27:33.074235 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070164 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070168 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070180 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070185 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070189 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070194 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070198 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070203 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070207 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070212 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070217 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070221 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070225 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070229 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070234 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070240 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:27:33.074787 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:33.070244 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:27:33.075176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.070252 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:27:33.075176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.070997 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:27:33.075176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.073660 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:27:33.075176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.074683 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 20:27:33.075176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.074781 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:27:33.075176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.074819 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:27:33.097478 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.097438 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:27:33.105476 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.105445 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:27:33.122487 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.122453 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:27:33.128374 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.128359 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 20:27:33.129767 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.129751 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:27:33.131085 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.131070 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:27:33.135504 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.135482 2577 fs.go:135] Filesystem UUIDs: map[3e4054dc-88db-4f31-9cbb-04565effa096:/dev/nvme0n1p4 65a2fd1c-b602-40f0-9d95-8841480a2031:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 20:27:33.135563 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.135504 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:27:33.141656 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.141548 2577 manager.go:217] Machine: {Timestamp:2026-04-16 20:27:33.139502416 +0000 UTC m=+0.408422817 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096715 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2af7dbb8489397da3f800497fb71e9 SystemUUID:ec2af7db-b848-9397-da3f-800497fb71e9 BootID:e6aa102a-cfdc-4b23-a0fe-5cbe447b52c0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:dd:f1:d7:48:c9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:dd:f1:d7:48:c9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:61:07:7a:b6:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:27:33.141656 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.141652 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:27:33.141766 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.141727 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:27:33.143150 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.143126 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:27:33.143319 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.143152 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-101.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:27:33.143362 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.143328 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:27:33.143362 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.143337 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:27:33.143362 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.143349
2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:27:33.144203 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.144192 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:27:33.145484 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.145472 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:27:33.145588 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.145578 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:27:33.147952 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.147943 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:27:33.147984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.147956 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:27:33.147984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.147968 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:27:33.147984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.147980 2577 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:27:33.148097 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.147997 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 20:27:33.149204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.149187 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:27:33.149292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.149208 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:27:33.149754 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.149738 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ml5qm" Apr 16 20:27:33.155250 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:33.155225 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:27:33.156628 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.156614 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:27:33.156892 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.156875 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ml5qm" Apr 16 20:27:33.158437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158423 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:27:33.158437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158439 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158446 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158452 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158457 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158478 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158484 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158490 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:27:33.158550 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:33.158497 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158503 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158511 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:27:33.158550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.158520 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:27:33.159606 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.159595 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:27:33.159606 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.159606 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:27:33.162951 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.162938 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:27:33.163029 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.162970 2577 server.go:1295] "Started kubelet" Apr 16 20:27:33.163085 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.163060 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:27:33.163133 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.163059 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:27:33.163133 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.163127 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:27:33.163617 ip-10-0-132-101 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 20:27:33.163765 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.163751 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:33.164874 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.164847 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:27:33.165034 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.165023 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:27:33.166708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.166693 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:33.166776 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.166693 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-101.ec2.internal" not found Apr 16 20:27:33.170553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.170536 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:27:33.170553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.170545 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:27:33.171321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171238 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:27:33.171321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171313 2577 factory.go:55] Registering systemd factory Apr 16 20:27:33.171321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171323 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:27:33.171526 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171334 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:27:33.171526 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171336 2577 volume_manager.go:297] "Starting 
Kubelet Volume Manager" Apr 16 20:27:33.171526 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171410 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:27:33.171526 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171418 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:27:33.171526 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.171450 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-101.ec2.internal\" not found" Apr 16 20:27:33.171925 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171911 2577 factory.go:153] Registering CRI-O factory Apr 16 20:27:33.171999 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171928 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 20:27:33.171999 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.171976 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:27:33.172091 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.172001 2577 factory.go:103] Registering Raw factory Apr 16 20:27:33.172091 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.172017 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 20:27:33.172451 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.172437 2577 manager.go:319] Starting recovery of all containers Apr 16 20:27:33.173316 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.173292 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:33.175434 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.175398 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:27:33.176225 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.176207 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-101.ec2.internal\" not found" node="ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.181794 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.181637 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-101.ec2.internal" not found Apr 16 20:27:33.184115 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.184087 2577 manager.go:324] Recovery completed Apr 16 20:27:33.188096 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.188084 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:33.189860 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.189847 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:33.189920 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.189872 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:33.189920 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.189882 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:33.190309 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.190296 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:27:33.190309 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.190307 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:27:33.190387 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.190323 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:27:33.193716 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:33.193705 2577 policy_none.go:49] "None policy: Start" Apr 16 20:27:33.193760 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.193720 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:27:33.193760 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.193729 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:27:33.234022 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234007 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.234039 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234049 2577 server.go:85] "Starting device plugin registration server" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234276 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234286 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234371 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234443 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.234448 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.234980 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.235012 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-101.ec2.internal\" not found" Apr 16 20:27:33.254809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.239119 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-101.ec2.internal" not found Apr 16 20:27:33.277969 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.277937 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:27:33.279104 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.279086 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 20:27:33.279187 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.279119 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:27:33.279187 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.279139 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 20:27:33.279187 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.279149 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:27:33.279326 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:33.279187 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:27:33.281151 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.281132 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:33.334941 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.334886 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:27:33.335708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.335693 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:27:33.335776 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.335721 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:27:33.335776 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.335735 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:27:33.335776 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.335768 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.344180 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.344166 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.379404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.379382 2577 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal"] Apr 16 20:27:33.382682 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.382666 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.382769 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.382672 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.401038 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.401019 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.404403 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.404388 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.419669 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.419650 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:27:33.419747 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.419654 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:27:33.572327 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.572300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99f8167536275bbcb60afc6c9f189a44-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal\" (UID: \"99f8167536275bbcb60afc6c9f189a44\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.572459 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.572332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99f8167536275bbcb60afc6c9f189a44-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal\" (UID: \"99f8167536275bbcb60afc6c9f189a44\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.572459 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.572351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e83c939a004f23d83533d2ed83f17f1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-101.ec2.internal\" (UID: \"5e83c939a004f23d83533d2ed83f17f1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.672526 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.672452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99f8167536275bbcb60afc6c9f189a44-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal\" (UID: \"99f8167536275bbcb60afc6c9f189a44\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.672526 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.672497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99f8167536275bbcb60afc6c9f189a44-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal\" (UID: \"99f8167536275bbcb60afc6c9f189a44\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.672526 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:33.672516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e83c939a004f23d83533d2ed83f17f1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-101.ec2.internal\" (UID: \"5e83c939a004f23d83533d2ed83f17f1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.672703 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.672550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99f8167536275bbcb60afc6c9f189a44-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal\" (UID: \"99f8167536275bbcb60afc6c9f189a44\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.672703 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.672565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99f8167536275bbcb60afc6c9f189a44-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal\" (UID: \"99f8167536275bbcb60afc6c9f189a44\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.672703 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.672569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5e83c939a004f23d83533d2ed83f17f1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-101.ec2.internal\" (UID: \"5e83c939a004f23d83533d2ed83f17f1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.722605 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.722583 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" Apr 16 20:27:33.722605 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:33.722601 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" Apr 16 20:27:34.074342 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.074263 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 20:27:34.074976 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.074437 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:27:34.074976 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.074437 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:27:34.074976 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.074437 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:27:34.148606 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.148579 2577 apiserver.go:52] "Watching apiserver" Apr 16 20:27:34.155758 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.155736 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:27:34.158655 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:27:34.158633 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:22:33 +0000 UTC" deadline="2027-11-30 07:27:33.348496848 +0000 UTC" Apr 16 20:27:34.158655 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.158654 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14218h59m59.189845188s" Apr 16 20:27:34.159998 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.159981 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-m848r","openshift-multus/multus-264w9","openshift-multus/multus-additional-cni-plugins-vvztc","openshift-network-diagnostics/network-check-target-22jqt","openshift-cluster-node-tuning-operator/tuned-dxcnl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal","openshift-multus/network-metrics-daemon-b8p9v","openshift-network-operator/iptables-alerter-gpdwg","openshift-ovn-kubernetes/ovnkube-node-5gm4x","kube-system/konnectivity-agent-9tzkv","kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g","openshift-dns/node-resolver-jk4m8"] Apr 16 20:27:34.165658 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.165640 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m848r" Apr 16 20:27:34.168268 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.168240 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.169640 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.169618 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.169717 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.169618 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 20:27:34.169870 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.169855 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6lc9h\""
Apr 16 20:27:34.170139 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.170126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.170238 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.170222 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:34.170314 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.170290 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:34.170392 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.170374 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.170670 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.170657 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:27:34.171332 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.171315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-54nkv\""
Apr 16 20:27:34.171421 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.171355 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 20:27:34.171421 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.171370 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.171421 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.171392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:27:34.171421 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.171355 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.172188 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.172163 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 20:27:34.172308 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.172210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.172432 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.172397 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 20:27:34.172586 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.172571 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t4m24\""
Apr 16 20:27:34.174198 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174183 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.174339 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:34.174405 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-sys\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.174405 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-os-release\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.174405 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.174385 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:34.174542 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174403 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-modprobe-d\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.174542 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4hfkv\""
Apr 16 20:27:34.174542 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-systemd\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.174542 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174482 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.174542 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz42m\" (UniqueName: \"kubernetes.io/projected/8df02eba-eb01-4603-87bd-76a281217485-kube-api-access-wz42m\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-cnibin\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-run\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-netns\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-hostroot\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-cni-binary-copy\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysctl-d\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-tmp\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.174793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-system-cni-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-socket-dir-parent\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-cni-bin\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-cni-multus\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-conf-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-etc-kubernetes\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysconfig\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8df02eba-eb01-4603-87bd-76a281217485-serviceca\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.174984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-kubelet\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-daemon-config\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlzw\" (UniqueName: \"kubernetes.io/projected/a8845fea-ade3-4e74-b157-294175ce8b1a-kube-api-access-nmlzw\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-kubernetes\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysctl-conf\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-host\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-tuned\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8df02eba-eb01-4603-87bd-76a281217485-host\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-multus-certs\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-system-cni-dir\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-var-lib-kubelet\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-cnibin\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-os-release\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8845fea-ade3-4e74-b157-294175ce8b1a-cni-binary-copy\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-k8s-cni-cncf-io\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-lib-modules\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg72m\" (UniqueName: \"kubernetes.io/projected/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-kube-api-access-sg72m\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-cni-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.175918 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.175516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmq2q\" (UniqueName: \"kubernetes.io/projected/ffb263cb-6f76-4dfe-a02a-435624b83457-kube-api-access-lmq2q\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.176272 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.176258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.178488 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.178456 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.178581 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.178564 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.178645 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.178595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wqkxr\""
Apr 16 20:27:34.178702 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.178666 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.180563 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.180544 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9tzkv"
Apr 16 20:27:34.180753 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.180718 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 20:27:34.180990 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.180966 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.181111 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.181083 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.181154 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.181132 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q49nx\""
Apr 16 20:27:34.181154 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.181140 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 20:27:34.182103 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.182057 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 20:27:34.182243 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.182121 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 20:27:34.182904 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.182891 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 20:27:34.182990 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.182895 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.184744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.183589 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fm5nh\""
Apr 16 20:27:34.184744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.183666 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 20:27:34.184744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.184280 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:27:34.184744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.184564 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 20:27:34.185835 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.185499 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.185835 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.185563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 20:27:34.186657 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.186630 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7ncss\""
Apr 16 20:27:34.186733 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.186656 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.186789 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.186639 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.187950 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.187923 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 20:27:34.188054 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.188031 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 20:27:34.188291 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.188276 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mxqm9\""
Apr 16 20:27:34.204439 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.204422 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vw8bm"
Apr 16 20:27:34.211669 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.211647 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vw8bm"
Apr 16 20:27:34.272199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.272178 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 20:27:34.275807 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-cnibin\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.275903 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-os-release\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.275903 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8845fea-ade3-4e74-b157-294175ce8b1a-cni-binary-copy\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.275903 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-lib-modules\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.276047 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-cnibin\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.276047 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41119468-3774-48bd-98be-d49ab3625162-hosts-file\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.276047 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.275918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-os-release\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.276204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276065 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:34.276204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276113 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-registration-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.276204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-lib-modules\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.276204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-etc-selinux\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.276204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-cni-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.276204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-kube-api-access-fl5dp\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-slash\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-cni-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-os-release\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-modprobe-d\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-systemd\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-systemd\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a8845fea-ade3-4e74-b157-294175ce8b1a-cni-binary-copy\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-os-release\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxr5\" (UniqueName: \"kubernetes.io/projected/30274609-546d-4c7b-abd0-8907fd0a6cd7-kube-api-access-pcxr5\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-modprobe-d\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovnkube-script-lib\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz42m\" (UniqueName: \"kubernetes.io/projected/8df02eba-eb01-4603-87bd-76a281217485-kube-api-access-wz42m\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r" Apr 16 20:27:34.276516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-cnibin\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-run\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjn49\" (UniqueName: \"kubernetes.io/projected/0df945b0-6094-4a50-aa50-80559da5aea1-kube-api-access-vjn49\") pod \"iptables-alerter-gpdwg\" (UID: 
\"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-kubelet\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-systemd-units\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-etc-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-node-log\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-cnibin\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovn-node-metrics-cert\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg72m\" (UniqueName: \"kubernetes.io/projected/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-kube-api-access-sg72m\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-system-cni-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276838 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-cni-multus\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-conf-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-etc-kubernetes\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysctl-conf\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.277120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-conf-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276959 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-system-cni-dir\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8df02eba-eb01-4603-87bd-76a281217485-serviceca\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.276989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-cni-multus\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-daemon-config\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277043 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nmlzw\" (UniqueName: \"kubernetes.io/projected/a8845fea-ade3-4e74-b157-294175ce8b1a-kube-api-access-nmlzw\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-cni-netd\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-var-lib-kubelet\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysctl-conf\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.277940 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:34.277151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8211bddd-6e75-435d-865a-caf384cfefad-konnectivity-ca\") pod \"konnectivity-agent-9tzkv\" (UID: \"8211bddd-6e75-435d-865a-caf384cfefad\") " pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-systemd\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-run\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-log-socket\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " 
pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-env-overrides\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.277940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-k8s-cni-cncf-io\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-var-lib-kubelet\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8211bddd-6e75-435d-865a-caf384cfefad-agent-certs\") pod 
\"konnectivity-agent-9tzkv\" (UID: \"8211bddd-6e75-435d-865a-caf384cfefad\") " pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-ovn\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-k8s-cni-cncf-io\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8df02eba-eb01-4603-87bd-76a281217485-serviceca\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-daemon-config\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-cni-bin\") pod 
\"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmq2q\" (UniqueName: \"kubernetes.io/projected/ffb263cb-6f76-4dfe-a02a-435624b83457-kube-api-access-lmq2q\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-sys\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-etc-kubernetes\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-sys\") pod 
\"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-sys-fs\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0df945b0-6094-4a50-aa50-80559da5aea1-host-slash\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdxv\" (UniqueName: \"kubernetes.io/projected/41119468-3774-48bd-98be-d49ab3625162-kube-api-access-pvdxv\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8" Apr 16 20:27:34.278732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-device-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277773 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-run-netns\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-var-lib-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277846 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-netns\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-netns\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:27:34.277944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-hostroot\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-cni-binary-copy\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.277999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysctl-d\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-tmp\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.279380 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:27:34.278092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0df945b0-6094-4a50-aa50-80559da5aea1-iptables-alerter-script\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-hostroot\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-socket-dir-parent\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysctl-d\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-cni-bin\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysconfig\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.279380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-host\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-multus-socket-dir-parent\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-tuned\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovnkube-config\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-sysconfig\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-cni-bin\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-kubelet\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-host\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-kubernetes\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278401 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41119468-3774-48bd-98be-d49ab3625162-tmp-dir\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-socket-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7s7\" (UniqueName: \"kubernetes.io/projected/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-kube-api-access-rp7s7\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8df02eba-eb01-4603-87bd-76a281217485-host\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-multus-certs\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-system-cni-dir\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.280225 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.280848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278664 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-var-lib-kubelet\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.280848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8df02eba-eb01-4603-87bd-76a281217485-host\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r"
Apr 16 20:27:34.280848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffb263cb-6f76-4dfe-a02a-435624b83457-system-cni-dir\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.280848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffb263cb-6f76-4dfe-a02a-435624b83457-cni-binary-copy\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.280848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a8845fea-ade3-4e74-b157-294175ce8b1a-host-run-multus-certs\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.280848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.278815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-kubernetes\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.281861 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.281844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-tmp\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.282279 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.282259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-etc-tuned\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.285426 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.285402 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:27:34.285525 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.285431 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:27:34.285525 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.285445 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:34.285641 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.285569 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:27:34.785541956 +0000 UTC m=+2.054462360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:34.286244 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.286225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg72m\" (UniqueName: \"kubernetes.io/projected/c8ba1b5d-f5a6-46f9-9bb6-366595e5970c-kube-api-access-sg72m\") pod \"tuned-dxcnl\" (UID: \"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c\") " pod="openshift-cluster-node-tuning-operator/tuned-dxcnl"
Apr 16 20:27:34.286244 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.286234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz42m\" (UniqueName: \"kubernetes.io/projected/8df02eba-eb01-4603-87bd-76a281217485-kube-api-access-wz42m\") pod \"node-ca-m848r\" (UID: \"8df02eba-eb01-4603-87bd-76a281217485\") " pod="openshift-image-registry/node-ca-m848r"
Apr 16 20:27:34.286583 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.286557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmq2q\" (UniqueName: \"kubernetes.io/projected/ffb263cb-6f76-4dfe-a02a-435624b83457-kube-api-access-lmq2q\") pod \"multus-additional-cni-plugins-vvztc\" (UID: \"ffb263cb-6f76-4dfe-a02a-435624b83457\") " pod="openshift-multus/multus-additional-cni-plugins-vvztc"
Apr 16 20:27:34.286846 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.286829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlzw\" (UniqueName: \"kubernetes.io/projected/a8845fea-ade3-4e74-b157-294175ce8b1a-kube-api-access-nmlzw\") pod \"multus-264w9\" (UID: \"a8845fea-ade3-4e74-b157-294175ce8b1a\") " pod="openshift-multus/multus-264w9"
Apr 16 20:27:34.291320 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.291298 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e83c939a004f23d83533d2ed83f17f1.slice/crio-9e8bb6674f34b93c6416d9eb106a577531d275b464e4bdfb32d9293a7742d6e9 WatchSource:0}: Error finding container 9e8bb6674f34b93c6416d9eb106a577531d275b464e4bdfb32d9293a7742d6e9: Status 404 returned error can't find the container with id 9e8bb6674f34b93c6416d9eb106a577531d275b464e4bdfb32d9293a7742d6e9
Apr 16 20:27:34.302566 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.302543 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:27:34.308230 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.308212 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f8167536275bbcb60afc6c9f189a44.slice/crio-85bdaa0eb7cdd3edbc347843792143c26cc9968792ac814e2ffaa1f2406a760f WatchSource:0}: Error finding container 85bdaa0eb7cdd3edbc347843792143c26cc9968792ac814e2ffaa1f2406a760f: Status 404 returned error can't find the container with id 85bdaa0eb7cdd3edbc347843792143c26cc9968792ac814e2ffaa1f2406a760f
Apr 16 20:27:34.379952 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.379952 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.379952 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379932 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-cni-netd\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.379952 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8211bddd-6e75-435d-865a-caf384cfefad-konnectivity-ca\") pod \"konnectivity-agent-9tzkv\" (UID: \"8211bddd-6e75-435d-865a-caf384cfefad\") " pod="kube-system/konnectivity-agent-9tzkv"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-systemd\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.379987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-cni-netd\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-log-socket\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-env-overrides\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8211bddd-6e75-435d-865a-caf384cfefad-agent-certs\") pod \"konnectivity-agent-9tzkv\" (UID: \"8211bddd-6e75-435d-865a-caf384cfefad\") " pod="kube-system/konnectivity-agent-9tzkv"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-ovn\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-systemd\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-ovn\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-log-socket\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-cni-bin\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-cni-bin\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-sys-fs\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0df945b0-6094-4a50-aa50-80559da5aea1-host-slash\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdxv\" (UniqueName: \"kubernetes.io/projected/41119468-3774-48bd-98be-d49ab3625162-kube-api-access-pvdxv\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-device-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-run-netns\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380292 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-sys-fs\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-var-lib-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0df945b0-6094-4a50-aa50-80559da5aea1-host-slash\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-device-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-run-netns\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0df945b0-6094-4a50-aa50-80559da5aea1-iptables-alerter-script\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-var-lib-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-run-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovnkube-config\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41119468-3774-48bd-98be-d49ab3625162-tmp-dir\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-socket-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.380812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7s7\" (UniqueName: \"kubernetes.io/projected/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-kube-api-access-rp7s7\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41119468-3774-48bd-98be-d49ab3625162-hosts-file\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-registration-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-etc-selinux\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-kube-api-access-fl5dp\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-slash\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxr5\" (UniqueName: \"kubernetes.io/projected/30274609-546d-4c7b-abd0-8907fd0a6cd7-kube-api-access-pcxr5\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovnkube-script-lib\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-socket-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjn49\" (UniqueName: \"kubernetes.io/projected/0df945b0-6094-4a50-aa50-80559da5aea1-kube-api-access-vjn49\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41119468-3774-48bd-98be-d49ab3625162-tmp-dir\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-kubelet\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-systemd-units\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-etc-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.381573 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0df945b0-6094-4a50-aa50-80559da5aea1-iptables-alerter-script\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-node-log\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-kubelet\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovn-node-metrics-cert\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.380992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41119468-3774-48bd-98be-d49ab3625162-hosts-file\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-etc-openvswitch\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-systemd-units\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-env-overrides\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x"
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.381083 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName:
\"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-registration-dir\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-etc-selinux\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.381163 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:34.881136226 +0000 UTC m=+2.150056814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-node-log\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovnkube-config\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-host-slash\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8211bddd-6e75-435d-865a-caf384cfefad-konnectivity-ca\") pod \"konnectivity-agent-9tzkv\" (UID: \"8211bddd-6e75-435d-865a-caf384cfefad\") " pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:34.382108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.381605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovnkube-script-lib\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.382585 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.382538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8211bddd-6e75-435d-865a-caf384cfefad-agent-certs\") pod \"konnectivity-agent-9tzkv\" (UID: \"8211bddd-6e75-435d-865a-caf384cfefad\") " pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:34.383055 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.383039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-ovn-node-metrics-cert\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.389819 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.389794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5dp\" (UniqueName: \"kubernetes.io/projected/989b9e2e-3c3c-4bd4-a883-e8af0168b1fa-kube-api-access-fl5dp\") pod \"aws-ebs-csi-driver-node-shf8g\" (UID: \"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.389981 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.389961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxr5\" (UniqueName: \"kubernetes.io/projected/30274609-546d-4c7b-abd0-8907fd0a6cd7-kube-api-access-pcxr5\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:27:34.390103 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.390088 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjn49\" (UniqueName: \"kubernetes.io/projected/0df945b0-6094-4a50-aa50-80559da5aea1-kube-api-access-vjn49\") pod \"iptables-alerter-gpdwg\" (UID: \"0df945b0-6094-4a50-aa50-80559da5aea1\") " pod="openshift-network-operator/iptables-alerter-gpdwg" Apr 16 20:27:34.390423 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.390408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7s7\" (UniqueName: \"kubernetes.io/projected/2a1c475f-d8da-4de3-9d2f-33da4c16e0fa-kube-api-access-rp7s7\") pod \"ovnkube-node-5gm4x\" (UID: \"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.390530 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.390516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdxv\" (UniqueName: \"kubernetes.io/projected/41119468-3774-48bd-98be-d49ab3625162-kube-api-access-pvdxv\") pod \"node-resolver-jk4m8\" (UID: \"41119468-3774-48bd-98be-d49ab3625162\") " pod="openshift-dns/node-resolver-jk4m8" Apr 16 20:27:34.497692 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.497649 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-m848r" Apr 16 20:27:34.503642 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.503618 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8df02eba_eb01_4603_87bd_76a281217485.slice/crio-5fc5b1e190fbbff825f08f6a22bfdfe8de8f275ca25cfb8b38c9eda5380953b1 WatchSource:0}: Error finding container 5fc5b1e190fbbff825f08f6a22bfdfe8de8f275ca25cfb8b38c9eda5380953b1: Status 404 returned error can't find the container with id 5fc5b1e190fbbff825f08f6a22bfdfe8de8f275ca25cfb8b38c9eda5380953b1 Apr 16 20:27:34.518268 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.518249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-264w9" Apr 16 20:27:34.522188 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.522162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vvztc" Apr 16 20:27:34.524438 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.524419 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8845fea_ade3_4e74_b157_294175ce8b1a.slice/crio-8085cc03eb56e6b7b2f2e373254ceea202aa5436ced7c776cc6cbd989c44aacc WatchSource:0}: Error finding container 8085cc03eb56e6b7b2f2e373254ceea202aa5436ced7c776cc6cbd989c44aacc: Status 404 returned error can't find the container with id 8085cc03eb56e6b7b2f2e373254ceea202aa5436ced7c776cc6cbd989c44aacc Apr 16 20:27:34.529251 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.529230 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffb263cb_6f76_4dfe_a02a_435624b83457.slice/crio-0bc1eb22c036f53a0edf7418b6a5b440f96413a05b96cc08acbc3b29f8343fce WatchSource:0}: Error finding container 
0bc1eb22c036f53a0edf7418b6a5b440f96413a05b96cc08acbc3b29f8343fce: Status 404 returned error can't find the container with id 0bc1eb22c036f53a0edf7418b6a5b440f96413a05b96cc08acbc3b29f8343fce Apr 16 20:27:34.542919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.542896 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" Apr 16 20:27:34.548442 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.548421 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ba1b5d_f5a6_46f9_9bb6_366595e5970c.slice/crio-ce4b832d46c2603e7e1627426ddc3b9b88ce760ff5ba7b6804a9377dc62c10f2 WatchSource:0}: Error finding container ce4b832d46c2603e7e1627426ddc3b9b88ce760ff5ba7b6804a9377dc62c10f2: Status 404 returned error can't find the container with id ce4b832d46c2603e7e1627426ddc3b9b88ce760ff5ba7b6804a9377dc62c10f2 Apr 16 20:27:34.558664 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.558648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gpdwg" Apr 16 20:27:34.564179 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.564157 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df945b0_6094_4a50_aa50_80559da5aea1.slice/crio-4a30981381e13e693647a1e2ca205dd4915f935d5c8a81ada24e079b8deddd57 WatchSource:0}: Error finding container 4a30981381e13e693647a1e2ca205dd4915f935d5c8a81ada24e079b8deddd57: Status 404 returned error can't find the container with id 4a30981381e13e693647a1e2ca205dd4915f935d5c8a81ada24e079b8deddd57 Apr 16 20:27:34.575660 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.575641 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:34.581640 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.581619 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1c475f_d8da_4de3_9d2f_33da4c16e0fa.slice/crio-0e2df578aab9d222fa510e64a9df32d40c032d183f6f34e46e84461b649f03c2 WatchSource:0}: Error finding container 0e2df578aab9d222fa510e64a9df32d40c032d183f6f34e46e84461b649f03c2: Status 404 returned error can't find the container with id 0e2df578aab9d222fa510e64a9df32d40c032d183f6f34e46e84461b649f03c2 Apr 16 20:27:34.602564 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.602549 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:34.608083 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.608064 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" Apr 16 20:27:34.608532 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.608510 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8211bddd_6e75_435d_865a_caf384cfefad.slice/crio-5afa00e563591f89c1c73b4e0e17b753c8948a4aeb1c9a8b64e003c4fd877d59 WatchSource:0}: Error finding container 5afa00e563591f89c1c73b4e0e17b753c8948a4aeb1c9a8b64e003c4fd877d59: Status 404 returned error can't find the container with id 5afa00e563591f89c1c73b4e0e17b753c8948a4aeb1c9a8b64e003c4fd877d59 Apr 16 20:27:34.613972 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.613953 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jk4m8" Apr 16 20:27:34.614237 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.614221 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod989b9e2e_3c3c_4bd4_a883_e8af0168b1fa.slice/crio-a0ee773a2e0af033f45ef15ce0d759ba37a79289d6a04a35007aa2c7e428c862 WatchSource:0}: Error finding container a0ee773a2e0af033f45ef15ce0d759ba37a79289d6a04a35007aa2c7e428c862: Status 404 returned error can't find the container with id a0ee773a2e0af033f45ef15ce0d759ba37a79289d6a04a35007aa2c7e428c862 Apr 16 20:27:34.619428 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:27:34.619401 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41119468_3774_48bd_98be_d49ab3625162.slice/crio-a538f801182cbfb7ffa1c861e1f2d3c4c22956204cac8f6df0e5c230afebe89d WatchSource:0}: Error finding container a538f801182cbfb7ffa1c861e1f2d3c4c22956204cac8f6df0e5c230afebe89d: Status 404 returned error can't find the container with id a538f801182cbfb7ffa1c861e1f2d3c4c22956204cac8f6df0e5c230afebe89d Apr 16 20:27:34.884378 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.884351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:27:34.884558 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:34.884397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " 
pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:34.884558 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.884518 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:34.884558 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.884541 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:34.884558 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.884560 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:34.884777 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.884572 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:34.884777 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.884584 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:35.884569701 +0000 UTC m=+3.153490094 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:34.884777 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:34.884620 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:27:35.884602969 +0000 UTC m=+3.153523357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:35.212803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.212707 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:22:34 +0000 UTC" deadline="2027-09-15 23:21:28.722993477 +0000 UTC" Apr 16 20:27:35.212803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.212746 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12410h53m53.510251803s" Apr 16 20:27:35.258495 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.258096 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:35.321026 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.320961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jk4m8" 
event={"ID":"41119468-3774-48bd-98be-d49ab3625162","Type":"ContainerStarted","Data":"a538f801182cbfb7ffa1c861e1f2d3c4c22956204cac8f6df0e5c230afebe89d"} Apr 16 20:27:35.341996 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.341953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"0e2df578aab9d222fa510e64a9df32d40c032d183f6f34e46e84461b649f03c2"} Apr 16 20:27:35.353137 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.353091 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:35.362978 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.362949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" event={"ID":"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c","Type":"ContainerStarted","Data":"ce4b832d46c2603e7e1627426ddc3b9b88ce760ff5ba7b6804a9377dc62c10f2"} Apr 16 20:27:35.380634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.380605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m848r" event={"ID":"8df02eba-eb01-4603-87bd-76a281217485","Type":"ContainerStarted","Data":"5fc5b1e190fbbff825f08f6a22bfdfe8de8f275ca25cfb8b38c9eda5380953b1"} Apr 16 20:27:35.402639 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.402582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" event={"ID":"5e83c939a004f23d83533d2ed83f17f1","Type":"ContainerStarted","Data":"9e8bb6674f34b93c6416d9eb106a577531d275b464e4bdfb32d9293a7742d6e9"} Apr 16 20:27:35.430774 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.428515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" 
event={"ID":"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa","Type":"ContainerStarted","Data":"a0ee773a2e0af033f45ef15ce0d759ba37a79289d6a04a35007aa2c7e428c862"} Apr 16 20:27:35.444350 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.444313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9tzkv" event={"ID":"8211bddd-6e75-435d-865a-caf384cfefad","Type":"ContainerStarted","Data":"5afa00e563591f89c1c73b4e0e17b753c8948a4aeb1c9a8b64e003c4fd877d59"} Apr 16 20:27:35.450447 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.450346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gpdwg" event={"ID":"0df945b0-6094-4a50-aa50-80559da5aea1","Type":"ContainerStarted","Data":"4a30981381e13e693647a1e2ca205dd4915f935d5c8a81ada24e079b8deddd57"} Apr 16 20:27:35.455871 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.455800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerStarted","Data":"0bc1eb22c036f53a0edf7418b6a5b440f96413a05b96cc08acbc3b29f8343fce"} Apr 16 20:27:35.473672 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.473560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-264w9" event={"ID":"a8845fea-ade3-4e74-b157-294175ce8b1a","Type":"ContainerStarted","Data":"8085cc03eb56e6b7b2f2e373254ceea202aa5436ced7c776cc6cbd989c44aacc"} Apr 16 20:27:35.492162 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.491952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" event={"ID":"99f8167536275bbcb60afc6c9f189a44","Type":"ContainerStarted","Data":"85bdaa0eb7cdd3edbc347843792143c26cc9968792ac814e2ffaa1f2406a760f"} Apr 16 20:27:35.593218 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.593185 2577 reflector.go:430] "Caches 
populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:27:35.892514 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.892388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:35.892514 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:35.892486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:27:35.892725 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:35.892569 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:27:35.892725 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:35.892593 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:35.892725 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:35.892597 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:27:35.892725 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:35.892613 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:35.892725 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:35.892649 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:37.892631315 +0000 UTC m=+5.161551706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:27:35.892725 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:35.892676 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:27:37.892658262 +0000 UTC m=+5.161578656 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:27:36.213635 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:36.213476 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:22:34 +0000 UTC" deadline="2027-10-01 04:50:23.766658809 +0000 UTC" Apr 16 20:27:36.213635 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:36.213515 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12776h22m47.553147977s" Apr 16 20:27:36.280362 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:36.280182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:36.280362 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:36.280350 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb" Apr 16 20:27:36.280623 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:36.280497 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:36.280623 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:36.280610 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:37.502621 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.502589 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-f6vnk"]
Apr 16 20:27:37.507021 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.507000 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.507138 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.507084 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:37.605557 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.605523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3412a3da-a76f-4f08-b537-12d8e7e96c9d-kubelet-config\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.605736 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.605571 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3412a3da-a76f-4f08-b537-12d8e7e96c9d-dbus\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.605736 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.605672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.706870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.706961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3412a3da-a76f-4f08-b537-12d8e7e96c9d-kubelet-config\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.706988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3412a3da-a76f-4f08-b537-12d8e7e96c9d-dbus\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.707172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3412a3da-a76f-4f08-b537-12d8e7e96c9d-dbus\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.707276 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.707331 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret podName:3412a3da-a76f-4f08-b537-12d8e7e96c9d nodeName:}" failed. No retries permitted until 2026-04-16 20:27:38.207311712 +0000 UTC m=+5.476232105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret") pod "global-pull-secret-syncer-f6vnk" (UID: "3412a3da-a76f-4f08-b537-12d8e7e96c9d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:37.707589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.707549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3412a3da-a76f-4f08-b537-12d8e7e96c9d-kubelet-config\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:37.908648 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.908560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:37.908648 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:37.908633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:37.908862 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.908811 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:37.908904 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.908886 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:41.908864766 +0000 UTC m=+9.177785160 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:37.908904 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.908820 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:27:37.908969 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.908915 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:27:37.908969 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.908929 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:37.909032 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:37.908982 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:27:41.908963718 +0000 UTC m=+9.177884113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:38.211959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:38.211859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:38.212121 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:38.212040 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:38.212121 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:38.212103 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret podName:3412a3da-a76f-4f08-b537-12d8e7e96c9d nodeName:}" failed. No retries permitted until 2026-04-16 20:27:39.21208396 +0000 UTC m=+6.481004349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret") pod "global-pull-secret-syncer-f6vnk" (UID: "3412a3da-a76f-4f08-b537-12d8e7e96c9d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:38.280477 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:38.280141 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:38.280477 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:38.280150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:38.280477 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:38.280259 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:38.280477 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:38.280361 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:39.221306 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:39.221262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:39.221938 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:39.221919 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:39.222012 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:39.221990 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret podName:3412a3da-a76f-4f08-b537-12d8e7e96c9d nodeName:}" failed. No retries permitted until 2026-04-16 20:27:41.221971511 +0000 UTC m=+8.490891904 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret") pod "global-pull-secret-syncer-f6vnk" (UID: "3412a3da-a76f-4f08-b537-12d8e7e96c9d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:39.280355 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:39.280018 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:39.280355 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:39.280161 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:40.280380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:40.279883 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:40.280380 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:40.280028 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:40.280380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:40.280273 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:40.280380 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:40.280375 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:41.239897 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:41.239859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:41.240048 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.239972 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:41.240048 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.240043 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret podName:3412a3da-a76f-4f08-b537-12d8e7e96c9d nodeName:}" failed. No retries permitted until 2026-04-16 20:27:45.240021282 +0000 UTC m=+12.508941691 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret") pod "global-pull-secret-syncer-f6vnk" (UID: "3412a3da-a76f-4f08-b537-12d8e7e96c9d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:41.280444 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:41.280409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:41.280875 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.280576 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:41.945101 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:41.945055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:41.945288 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:41.945113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:41.945288 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.945229 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:27:41.945288 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.945247 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:27:41.945288 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.945276 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:41.945548 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.945388 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:27:49.945315826 +0000 UTC m=+17.214236221 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:41.945759 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.945731 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:41.945887 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:41.945809 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:27:49.945789654 +0000 UTC m=+17.214710051 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:42.280082 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:42.280005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:42.280254 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:42.280005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:42.280254 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:42.280154 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:42.280254 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:42.280184 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:43.280505 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:43.280458 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:43.280946 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:43.280582 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:44.280208 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:44.280177 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:44.280406 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:44.280179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:44.280406 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:44.280312 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:44.280406 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:44.280378 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:45.269041 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:45.268975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:45.269244 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:45.269157 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:45.269244 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:45.269241 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret podName:3412a3da-a76f-4f08-b537-12d8e7e96c9d nodeName:}" failed. No retries permitted until 2026-04-16 20:27:53.269218578 +0000 UTC m=+20.538138969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret") pod "global-pull-secret-syncer-f6vnk" (UID: "3412a3da-a76f-4f08-b537-12d8e7e96c9d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:27:45.280440 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:45.280407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:45.280607 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:45.280555 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:46.280177 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:46.280136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:46.280368 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:46.280258 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:46.280368 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:46.280318 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:46.280509 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:46.280433 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:47.279936 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:47.279895 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:47.280428 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:47.280027 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:48.280572 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:48.280292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:48.280998 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:48.280631 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:48.280998 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:48.280300 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:48.280998 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:48.280803 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:49.279871 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:49.279835 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:49.280054 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:49.279969 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:50.001318 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:50.001277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:50.001344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.001482 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.001497 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.001516 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.001530 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.001565 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.00154289 +0000 UTC m=+33.270463288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:27:50.001853 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.001585 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.001576196 +0000 UTC m=+33.270496584 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:27:50.280120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:50.280046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:50.280272 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:50.280049 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:50.280272 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.280166 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:50.280272 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:50.280231 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:51.279797 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:51.279758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:27:51.280271 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:51.279891 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:27:52.279602 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:52.279564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:27:52.279739 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:52.279564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:27:52.279739 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:52.279675 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:27:52.279818 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:52.279734 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:27:52.528994 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:52.528961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" event={"ID":"c8ba1b5d-f5a6-46f9-9bb6-366595e5970c","Type":"ContainerStarted","Data":"961a7916ab6a95f5b2eafaff2b3529e988f6e91c16a7014e9dcc1c59c258ebe2"}
Apr 16 20:27:52.531144 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:52.531120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" event={"ID":"5e83c939a004f23d83533d2ed83f17f1","Type":"ContainerStarted","Data":"0c2778f7f7571c578eee75800622c115f69e68f05bbb1366403dcbdec05ff822"}
Apr 16 20:27:52.546017 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:52.545949 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dxcnl" podStartSLOduration=1.744629154 podStartE2EDuration="19.545930371s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.550081297 +0000 UTC m=+1.819001686" lastFinishedPulling="2026-04-16 20:27:52.351382509 +0000 UTC m=+19.620302903" observedRunningTime="2026-04-16 20:27:52.5455772 +0000 UTC m=+19.814497612" watchObservedRunningTime="2026-04-16 20:27:52.545930371 +0000 UTC m=+19.814850782"
Apr 16 20:27:53.282729 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.282567 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk" Apr 16 20:27:53.282902 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:53.282803 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d" Apr 16 20:27:53.322042 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.322011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk" Apr 16 20:27:53.322226 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:53.322197 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:53.322289 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:53.322263 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret podName:3412a3da-a76f-4f08-b537-12d8e7e96c9d nodeName:}" failed. No retries permitted until 2026-04-16 20:28:09.322244219 +0000 UTC m=+36.591164608 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret") pod "global-pull-secret-syncer-f6vnk" (UID: "3412a3da-a76f-4f08-b537-12d8e7e96c9d") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:27:53.533694 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.533657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" event={"ID":"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa","Type":"ContainerStarted","Data":"c40b090a6a7ba692b00a6aa6777e1ba9a7968ccccc49da2360551e81380d510e"} Apr 16 20:27:53.534875 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.534851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9tzkv" event={"ID":"8211bddd-6e75-435d-865a-caf384cfefad","Type":"ContainerStarted","Data":"ff0cc8eccea3626585d99e567760713e94e25f11d2dc03455abec31fbdbcd05e"} Apr 16 20:27:53.536071 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.536053 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffb263cb-6f76-4dfe-a02a-435624b83457" containerID="fda2122f3ee0cbd8507b028e3f71700f079a000095461e5324ffd8a3dbf91033" exitCode=0 Apr 16 20:27:53.536169 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.536103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerDied","Data":"fda2122f3ee0cbd8507b028e3f71700f079a000095461e5324ffd8a3dbf91033"} Apr 16 20:27:53.537374 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.537355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-264w9" event={"ID":"a8845fea-ade3-4e74-b157-294175ce8b1a","Type":"ContainerStarted","Data":"da33e739b872d8579248947ad3c0a55f0c60824ca61e931c8673e2a773275039"} Apr 16 20:27:53.538562 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:27:53.538539 2577 generic.go:358] "Generic (PLEG): container finished" podID="99f8167536275bbcb60afc6c9f189a44" containerID="23307a81c55fc7c16c7e828c24f0ee32529bb5f2044491462271f256db909540" exitCode=0 Apr 16 20:27:53.538656 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.538605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" event={"ID":"99f8167536275bbcb60afc6c9f189a44","Type":"ContainerDied","Data":"23307a81c55fc7c16c7e828c24f0ee32529bb5f2044491462271f256db909540"} Apr 16 20:27:53.539850 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.539826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jk4m8" event={"ID":"41119468-3774-48bd-98be-d49ab3625162","Type":"ContainerStarted","Data":"37e430435b47b763b040c6cf2290bf4b84a406b76fbf5f3d23be43e88fa86b28"} Apr 16 20:27:53.544376 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544362 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:27:53.544657 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544640 2577 generic.go:358] "Generic (PLEG): container finished" podID="2a1c475f-d8da-4de3-9d2f-33da4c16e0fa" containerID="bd261103611efe7f4f408f3055911cb578e8223792a63983aa2430e5c835ec2c" exitCode=1 Apr 16 20:27:53.544711 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"f5534692d8094401fb07df250f25df2bb2e253983f5f32b29bb5beb3f37db22c"} Apr 16 20:27:53.544744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" 
event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"ea57ebd60d854c5f1a9ccc04256a2bcec9a6173f3a5b7c17199c0987347043f5"} Apr 16 20:27:53.544744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"2304660cd2f67f01f0b833d786be5185ffb2a2c1274bf7dcd656f1dcbc1f1106"} Apr 16 20:27:53.544744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"1791de99aa20dc0fb710d5ef1119417380edf6f40d1cb82f8baab16510c1561c"} Apr 16 20:27:53.544857 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"836f8df69d6d9a96cbc4850bfa7c763cdea8a331c43b61012f956d73bf6629a7"} Apr 16 20:27:53.544857 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.544760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerDied","Data":"bd261103611efe7f4f408f3055911cb578e8223792a63983aa2430e5c835ec2c"} Apr 16 20:27:53.545963 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.545943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m848r" event={"ID":"8df02eba-eb01-4603-87bd-76a281217485","Type":"ContainerStarted","Data":"8938c546d82ea26b689f4476f41604d76834d0b0e78e70b58b0da4ae5c952971"} Apr 16 20:27:53.548192 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.548164 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-101.ec2.internal" podStartSLOduration=20.548154999 podStartE2EDuration="20.548154999s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:52.559618214 +0000 UTC m=+19.828538625" watchObservedRunningTime="2026-04-16 20:27:53.548154999 +0000 UTC m=+20.817075408" Apr 16 20:27:53.560579 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.560545 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9tzkv" podStartSLOduration=2.8293570470000002 podStartE2EDuration="20.560536046s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.610678239 +0000 UTC m=+1.879598630" lastFinishedPulling="2026-04-16 20:27:52.341857226 +0000 UTC m=+19.610777629" observedRunningTime="2026-04-16 20:27:53.548268055 +0000 UTC m=+20.817188451" watchObservedRunningTime="2026-04-16 20:27:53.560536046 +0000 UTC m=+20.829456457" Apr 16 20:27:53.560720 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.560699 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-m848r" podStartSLOduration=10.493589863 podStartE2EDuration="20.56069426s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.505155076 +0000 UTC m=+1.774075465" lastFinishedPulling="2026-04-16 20:27:44.572259464 +0000 UTC m=+11.841179862" observedRunningTime="2026-04-16 20:27:53.5602528 +0000 UTC m=+20.829173210" watchObservedRunningTime="2026-04-16 20:27:53.56069426 +0000 UTC m=+20.829614670" Apr 16 20:27:53.594025 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.593978 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-264w9" podStartSLOduration=2.616533652 podStartE2EDuration="20.593964599s" 
podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.526061311 +0000 UTC m=+1.794981700" lastFinishedPulling="2026-04-16 20:27:52.503492242 +0000 UTC m=+19.772412647" observedRunningTime="2026-04-16 20:27:53.575211287 +0000 UTC m=+20.844131698" watchObservedRunningTime="2026-04-16 20:27:53.593964599 +0000 UTC m=+20.862885008" Apr 16 20:27:53.606977 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:53.606925 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jk4m8" podStartSLOduration=2.882200223 podStartE2EDuration="20.606907022s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.620940171 +0000 UTC m=+1.889860559" lastFinishedPulling="2026-04-16 20:27:52.345646955 +0000 UTC m=+19.614567358" observedRunningTime="2026-04-16 20:27:53.606444746 +0000 UTC m=+20.875365159" watchObservedRunningTime="2026-04-16 20:27:53.606907022 +0000 UTC m=+20.875827435" Apr 16 20:27:54.280035 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.280000 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:54.280169 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:54.280130 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb" Apr 16 20:27:54.280531 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.280000 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:27:54.280655 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:54.280634 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7" Apr 16 20:27:54.337913 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.337883 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:27:54.549531 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.549394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" event={"ID":"99f8167536275bbcb60afc6c9f189a44","Type":"ContainerStarted","Data":"58f9a7c5690771008155d6a359280ab2002844dbd2e8d251ccc0060a3985e404"} Apr 16 20:27:54.551358 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.551325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" event={"ID":"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa","Type":"ContainerStarted","Data":"3eaece30d7b914bcf5a116ba59427e13388d7aef446bb72495d388b4658c5201"} Apr 16 20:27:54.552840 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.552807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gpdwg" event={"ID":"0df945b0-6094-4a50-aa50-80559da5aea1","Type":"ContainerStarted","Data":"8aa16672504c0c59994d8f5e3788ceffdc8d1ea9d7ffcd5b33272471c72e0111"} Apr 16 20:27:54.563606 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.563554 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-101.ec2.internal" podStartSLOduration=21.563535433 podStartE2EDuration="21.563535433s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:54.563285652 +0000 UTC m=+21.832206062" watchObservedRunningTime="2026-04-16 20:27:54.563535433 +0000 UTC m=+21.832455845" Apr 16 20:27:54.576098 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:54.576045 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gpdwg" podStartSLOduration=3.789824188 podStartE2EDuration="21.576027642s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.565559075 +0000 UTC m=+1.834479462" lastFinishedPulling="2026-04-16 20:27:52.351762521 +0000 UTC m=+19.620682916" observedRunningTime="2026-04-16 20:27:54.575827739 +0000 UTC m=+21.844748149" watchObservedRunningTime="2026-04-16 20:27:54.576027642 +0000 UTC m=+21.844948053" Apr 16 20:27:55.091311 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.091275 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:55.091987 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.091965 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:55.247100 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.246985 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:27:54.337911295Z","UUID":"367a7ca8-ea13-4b26-98eb-4bbf2504a349","Handler":null,"Name":"","Endpoint":""} Apr 16 20:27:55.248886 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:27:55.248863 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:27:55.249036 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.248893 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:27:55.283113 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.283087 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk" Apr 16 20:27:55.283228 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:55.283200 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d" Apr 16 20:27:55.556640 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.556542 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" event={"ID":"989b9e2e-3c3c-4bd4-a883-e8af0168b1fa","Type":"ContainerStarted","Data":"74354d6a8cdea2051c135629c7d816422a2bd9aaeeea247c865540b02507943c"} Apr 16 20:27:55.559537 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.559516 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:27:55.560065 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.559956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"9929f21990a823c3ccb4b475461cc2c056efc64ddd17419591acab802c392f04"} Apr 16 20:27:55.560525 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.560504 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:55.561186 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.561168 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9tzkv" Apr 16 20:27:55.572655 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:55.572618 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-shf8g" podStartSLOduration=1.91855686 podStartE2EDuration="22.572608279s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.616622977 +0000 UTC m=+1.885543377" lastFinishedPulling="2026-04-16 20:27:55.270674403 +0000 UTC m=+22.539594796" observedRunningTime="2026-04-16 20:27:55.572313339 +0000 UTC 
m=+22.841233749" watchObservedRunningTime="2026-04-16 20:27:55.572608279 +0000 UTC m=+22.841528699" Apr 16 20:27:56.280125 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:56.280097 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:27:56.280303 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:56.280231 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7" Apr 16 20:27:56.280303 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:56.280289 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:56.280425 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:56.280397 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb" Apr 16 20:27:57.280134 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:57.280054 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk" Apr 16 20:27:57.280582 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:57.280151 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d" Apr 16 20:27:58.280066 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.280038 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:27:58.280199 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:58.280158 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb" Apr 16 20:27:58.280875 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.280202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:27:58.280875 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:58.280293 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7" Apr 16 20:27:58.568778 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.568692 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffb263cb-6f76-4dfe-a02a-435624b83457" containerID="1edd2080f4d0c3e87d83955496e18b2c06d1cf18966ab51645fca9fbd8c1bf0a" exitCode=0 Apr 16 20:27:58.568914 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.568785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerDied","Data":"1edd2080f4d0c3e87d83955496e18b2c06d1cf18966ab51645fca9fbd8c1bf0a"} Apr 16 20:27:58.571623 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.571606 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:27:58.571959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.571937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"e2b260102dd84e56a905a906435ba2b1a8e18e9f4800756000f08d7f9b96d994"} Apr 16 20:27:58.572258 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.572242 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:58.572370 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.572356 2577 scope.go:117] "RemoveContainer" containerID="bd261103611efe7f4f408f3055911cb578e8223792a63983aa2430e5c835ec2c" Apr 16 20:27:58.587325 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:58.587304 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:59.279683 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:27:59.279616 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk" Apr 16 20:27:59.279861 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:27:59.279751 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d" Apr 16 20:27:59.577604 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:59.577532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:27:59.577980 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:59.577883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" event={"ID":"2a1c475f-d8da-4de3-9d2f-33da4c16e0fa","Type":"ContainerStarted","Data":"157851f3ba33ff8341ccf22cefb8d9a620db4e18273db0f467766aef996bc49f"} Apr 16 20:27:59.578341 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:59.578312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:59.578439 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:59.578351 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:59.593865 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:59.593837 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:27:59.605489 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:27:59.605429 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" podStartSLOduration=8.781697887 podStartE2EDuration="26.605414784s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.583099282 +0000 UTC m=+1.852019671" lastFinishedPulling="2026-04-16 20:27:52.406816177 +0000 UTC m=+19.675736568" observedRunningTime="2026-04-16 20:27:59.604043331 +0000 UTC m=+26.872963743" watchObservedRunningTime="2026-04-16 20:27:59.605414784 +0000 UTC m=+26.874335193"
Apr 16 20:28:00.037026 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.036992 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-22jqt"]
Apr 16 20:28:00.037160 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.037126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:28:00.037225 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:00.037208 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:28:00.040652 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.040617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f6vnk"]
Apr 16 20:28:00.040791 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.040709 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:00.040791 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:00.040784 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:28:00.041160 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.041142 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b8p9v"]
Apr 16 20:28:00.041246 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.041236 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:28:00.041331 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:00.041315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:28:00.581478 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.581426 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffb263cb-6f76-4dfe-a02a-435624b83457" containerID="4ca002f12caaf194db8fd9f7dafbad48a3ecc3dc5d7c9f24f4f85dcb060153d3" exitCode=0
Apr 16 20:28:00.581839 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:00.581512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerDied","Data":"4ca002f12caaf194db8fd9f7dafbad48a3ecc3dc5d7c9f24f4f85dcb060153d3"}
Apr 16 20:28:01.279844 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:01.279817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:01.279973 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:01.279919 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:28:01.585457 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:01.585305 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffb263cb-6f76-4dfe-a02a-435624b83457" containerID="d77805a59d561b909abde883c1350f12aec87d9a40dab9239ca4e03bc9744abe" exitCode=0
Apr 16 20:28:01.585840 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:01.585378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerDied","Data":"d77805a59d561b909abde883c1350f12aec87d9a40dab9239ca4e03bc9744abe"}
Apr 16 20:28:02.280227 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:02.280194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:28:02.280384 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:02.280194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:28:02.280384 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:02.280330 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:28:02.280513 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:02.280419 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:28:03.280054 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:03.280020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:03.280508 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:03.280098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f6vnk" podUID="3412a3da-a76f-4f08-b537-12d8e7e96c9d"
Apr 16 20:28:04.279361 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.279325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:28:04.279525 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.279326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:28:04.279525 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:04.279455 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-22jqt" podUID="fc79cedd-f56e-4d89-bb14-fc539e3148fb"
Apr 16 20:28:04.279654 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:04.279565 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7"
Apr 16 20:28:04.584232 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.584203 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-101.ec2.internal" event="NodeReady"
Apr 16 20:28:04.584693 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.584338 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 20:28:04.623688 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.623654 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xbszw"]
Apr 16 20:28:04.654967 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.654940 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-skwc8"]
Apr 16 20:28:04.655143 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.655101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.657972 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.657700 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:28:04.657972 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.657746 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:28:04.657972 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.657806 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w999j\""
Apr 16 20:28:04.675930 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.675901 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xbszw"]
Apr 16 20:28:04.676047 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.675948 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-skwc8"]
Apr 16 20:28:04.676047 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.675949 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:04.678430 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.678405 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:28:04.678563 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.678434 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2cgzs\""
Apr 16 20:28:04.678563 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.678517 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:28:04.678563 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.678540 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:28:04.817129 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.817091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:04.817129 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.817134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2a1535c-c8dc-4688-a07f-00a01b4dec34-config-volume\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.817358 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.817155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjqp\" (UniqueName: \"kubernetes.io/projected/a2a1535c-c8dc-4688-a07f-00a01b4dec34-kube-api-access-7hjqp\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.817358 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.817231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cg27\" (UniqueName: \"kubernetes.io/projected/f1de9e75-c8b2-4fee-898a-82488ff8d677-kube-api-access-6cg27\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:04.817358 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.817288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.817532 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.817360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2a1535c-c8dc-4688-a07f-00a01b4dec34-tmp-dir\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.918369 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cg27\" (UniqueName: \"kubernetes.io/projected/f1de9e75-c8b2-4fee-898a-82488ff8d677-kube-api-access-6cg27\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:04.918547 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.918547 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2a1535c-c8dc-4688-a07f-00a01b4dec34-tmp-dir\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.918547 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:04.918547 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2a1535c-c8dc-4688-a07f-00a01b4dec34-config-volume\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.918547 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hjqp\" (UniqueName: \"kubernetes.io/projected/a2a1535c-c8dc-4688-a07f-00a01b4dec34-kube-api-access-7hjqp\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.918821 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:04.918538 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:04.918821 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:04.918538 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:04.918821 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:04.918611 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:05.418591937 +0000 UTC m=+32.687512330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:04.918821 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:04.918666 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:05.41864606 +0000 UTC m=+32.687566462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found
Apr 16 20:28:04.918821 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.918785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a2a1535c-c8dc-4688-a07f-00a01b4dec34-tmp-dir\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.928442 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.928422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cg27\" (UniqueName: \"kubernetes.io/projected/f1de9e75-c8b2-4fee-898a-82488ff8d677-kube-api-access-6cg27\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:04.929591 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.929575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2a1535c-c8dc-4688-a07f-00a01b4dec34-config-volume\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:04.936731 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:04.936713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hjqp\" (UniqueName: \"kubernetes.io/projected/a2a1535c-c8dc-4688-a07f-00a01b4dec34-kube-api-access-7hjqp\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:05.279906 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:05.279823 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:05.282581 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:05.282558 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 20:28:05.422804 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:05.422764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:05.422997 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:05.422826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:05.422997 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:05.422951 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:05.422997 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:05.422994 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:05.423126 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:05.423019 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.423000589 +0000 UTC m=+33.691920977 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found
Apr 16 20:28:05.423126 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:05.423036 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:06.423027334 +0000 UTC m=+33.691947724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:06.028540 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.028504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.028563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.028676 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.028706 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.028721 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.028731 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wkh4x for pod openshift-network-diagnostics/network-check-target-22jqt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.028739 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:38.028724782 +0000 UTC m=+65.297645175 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:28:06.029231 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.028774 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x podName:fc79cedd-f56e-4d89-bb14-fc539e3148fb nodeName:}" failed. No retries permitted until 2026-04-16 20:28:38.028762901 +0000 UTC m=+65.297683289 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wkh4x" (UniqueName: "kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x") pod "network-check-target-22jqt" (UID: "fc79cedd-f56e-4d89-bb14-fc539e3148fb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:28:06.279846 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.279771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:28:06.280104 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.279771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt"
Apr 16 20:28:06.282385 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.282360 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:28:06.283353 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.283324 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:28:06.283353 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.283338 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwldw\""
Apr 16 20:28:06.283353 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.283346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:28:06.283579 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.283338 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8bd2\""
Apr 16 20:28:06.431614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.431577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:06.431770 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:06.431623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:06.431770 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.431739 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:06.431848 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.431792 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:06.431848 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.431810 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:08.431791232 +0000 UTC m=+35.700711625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:06.431927 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:06.431855 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:08.431841852 +0000 UTC m=+35.700762240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found
Apr 16 20:28:08.447768 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:08.447732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:08.447768 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:08.447779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:08.448376 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:08.447896 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:08.448376 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:08.447929 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:08.448376 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:08.447971 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:12.447951775 +0000 UTC m=+39.716872163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:08.448376 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:08.447987 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:12.447981261 +0000 UTC m=+39.716901649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found
Apr 16 20:28:09.356642 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:09.356606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:09.358935 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:09.358905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3412a3da-a76f-4f08-b537-12d8e7e96c9d-original-pull-secret\") pod \"global-pull-secret-syncer-f6vnk\" (UID: \"3412a3da-a76f-4f08-b537-12d8e7e96c9d\") " pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:09.490907 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:09.490885 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f6vnk"
Apr 16 20:28:09.634584 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:09.634376 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f6vnk"]
Apr 16 20:28:09.637316 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:28:09.637292 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3412a3da_a76f_4f08_b537_12d8e7e96c9d.slice/crio-4984747b1ed064c4942ec14e0373c8cdc05743d6aace5fde72245302146e0ca6 WatchSource:0}: Error finding container 4984747b1ed064c4942ec14e0373c8cdc05743d6aace5fde72245302146e0ca6: Status 404 returned error can't find the container with id 4984747b1ed064c4942ec14e0373c8cdc05743d6aace5fde72245302146e0ca6
Apr 16 20:28:10.604485 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:10.604321 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffb263cb-6f76-4dfe-a02a-435624b83457" containerID="d541cfab3e21463160204332c50ed4a21ea7447271cd64831365b7b6462ddec9" exitCode=0
Apr 16 20:28:10.604485 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:10.604429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerDied","Data":"d541cfab3e21463160204332c50ed4a21ea7447271cd64831365b7b6462ddec9"}
Apr 16 20:28:10.606563 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:10.606534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f6vnk" event={"ID":"3412a3da-a76f-4f08-b537-12d8e7e96c9d","Type":"ContainerStarted","Data":"4984747b1ed064c4942ec14e0373c8cdc05743d6aace5fde72245302146e0ca6"}
Apr 16 20:28:11.611652 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:11.611613 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffb263cb-6f76-4dfe-a02a-435624b83457" containerID="8cb20631fe018aa0c18ca5a0f270e40c8c32d0d0ccec3700cc5077c2018c9772" exitCode=0
Apr 16 20:28:11.612078 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:11.611684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerDied","Data":"8cb20631fe018aa0c18ca5a0f270e40c8c32d0d0ccec3700cc5077c2018c9772"}
Apr 16 20:28:12.481720 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:12.481677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw"
Apr 16 20:28:12.481720 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:12.481732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8"
Apr 16 20:28:12.481974 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:12.481839 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:28:12.481974 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:12.481854 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:28:12.481974 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:12.481910 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:20.481893838 +0000 UTC m=+47.750814226 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found
Apr 16 20:28:12.481974 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:12.481924 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:20.48191857 +0000 UTC m=+47.750838958 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found
Apr 16 20:28:12.617888 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:12.617850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vvztc" event={"ID":"ffb263cb-6f76-4dfe-a02a-435624b83457","Type":"ContainerStarted","Data":"c591d35c920de3d904a82920f3474b76e37249f6e480a8c9f2eca086123b26e7"}
Apr 16 20:28:12.645984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:12.645930 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vvztc" podStartSLOduration=4.689844627 podStartE2EDuration="39.645908831s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:27:34.530654342 +0000 UTC m=+1.799574730" lastFinishedPulling="2026-04-16 20:28:09.486718535 +0000 UTC m=+36.755638934" observedRunningTime="2026-04-16 20:28:12.645056096 +0000 UTC m=+39.913976508" watchObservedRunningTime="2026-04-16 20:28:12.645908831 +0000 UTC m=+39.914829245"
Apr 16 20:28:13.621311 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:13.621278 2577 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f6vnk" event={"ID":"3412a3da-a76f-4f08-b537-12d8e7e96c9d","Type":"ContainerStarted","Data":"65f5f7fd33bbee1af55073bd50a06c3e109ab84e4daeb9866257a07a27e12ca1"} Apr 16 20:28:13.636529 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:13.636431 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-f6vnk" podStartSLOduration=32.835731752 podStartE2EDuration="36.636415119s" podCreationTimestamp="2026-04-16 20:27:37 +0000 UTC" firstStartedPulling="2026-04-16 20:28:09.639061565 +0000 UTC m=+36.907981953" lastFinishedPulling="2026-04-16 20:28:13.439744912 +0000 UTC m=+40.708665320" observedRunningTime="2026-04-16 20:28:13.636079306 +0000 UTC m=+40.904999717" watchObservedRunningTime="2026-04-16 20:28:13.636415119 +0000 UTC m=+40.905335532" Apr 16 20:28:20.539163 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:20.539121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw" Apr 16 20:28:20.539555 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:20.539183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:28:20.539555 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:20.539284 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:20.539555 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:20.539360 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:36.539344093 +0000 UTC m=+63.808264485 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found Apr 16 20:28:20.539555 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:20.539291 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:20.539555 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:20.539424 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:28:36.539412364 +0000 UTC m=+63.808332752 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found Apr 16 20:28:31.601536 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:31.601502 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gm4x" Apr 16 20:28:36.547383 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:36.547343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw" Apr 16 20:28:36.547383 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:36.547386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:28:36.547844 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:36.547507 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:28:36.547844 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:36.547570 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:08.547553929 +0000 UTC m=+95.816474321 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found Apr 16 20:28:36.547844 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:36.547513 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:28:36.547844 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:36.547650 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:08.547637835 +0000 UTC m=+95.816558224 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found Apr 16 20:28:38.058220 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.058179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:28:38.058629 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.058225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:28:38.061264 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:28:38.061247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:28:38.061327 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.061282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:28:38.068626 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:38.068604 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:28:38.068685 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:28:38.068675 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:42.068657094 +0000 UTC m=+129.337577482 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : secret "metrics-daemon-secret" not found Apr 16 20:28:38.070788 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.070772 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:28:38.082101 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.082081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkh4x\" (UniqueName: \"kubernetes.io/projected/fc79cedd-f56e-4d89-bb14-fc539e3148fb-kube-api-access-wkh4x\") pod \"network-check-target-22jqt\" (UID: \"fc79cedd-f56e-4d89-bb14-fc539e3148fb\") " pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:28:38.099661 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.099638 2577 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwldw\"" Apr 16 20:28:38.107533 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.107515 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:28:38.233032 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.232988 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-22jqt"] Apr 16 20:28:38.237324 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:28:38.237298 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc79cedd_f56e_4d89_bb14_fc539e3148fb.slice/crio-dd5774c47eb1dcde1890d0f1759c3756cc723e6df435cc38d1d6e8e5262291ae WatchSource:0}: Error finding container dd5774c47eb1dcde1890d0f1759c3756cc723e6df435cc38d1d6e8e5262291ae: Status 404 returned error can't find the container with id dd5774c47eb1dcde1890d0f1759c3756cc723e6df435cc38d1d6e8e5262291ae Apr 16 20:28:38.671649 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:38.671613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-22jqt" event={"ID":"fc79cedd-f56e-4d89-bb14-fc539e3148fb","Type":"ContainerStarted","Data":"dd5774c47eb1dcde1890d0f1759c3756cc723e6df435cc38d1d6e8e5262291ae"} Apr 16 20:28:42.680265 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:42.680231 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-22jqt" event={"ID":"fc79cedd-f56e-4d89-bb14-fc539e3148fb","Type":"ContainerStarted","Data":"97777f36ddf6bfc33c7c98cadc5e2ebecf53962649736dc75ed9746e1e86d10d"} Apr 16 20:28:42.680651 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:42.680349 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:28:42.695156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:28:42.695112 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-22jqt" podStartSLOduration=66.202701127 podStartE2EDuration="1m9.695097895s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:28:38.239097731 +0000 UTC m=+65.508018119" lastFinishedPulling="2026-04-16 20:28:41.731494485 +0000 UTC m=+69.000414887" observedRunningTime="2026-04-16 20:28:42.694312248 +0000 UTC m=+69.963232657" watchObservedRunningTime="2026-04-16 20:28:42.695097895 +0000 UTC m=+69.964018363" Apr 16 20:29:08.570964 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:08.570911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw" Apr 16 20:29:08.570964 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:08.570964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:29:08.571533 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:08.571048 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:29:08.571533 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:08.571053 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:29:08.571533 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:08.571109 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert podName:f1de9e75-c8b2-4fee-898a-82488ff8d677 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:12.571094774 +0000 UTC m=+159.840015162 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert") pod "ingress-canary-skwc8" (UID: "f1de9e75-c8b2-4fee-898a-82488ff8d677") : secret "canary-serving-cert" not found Apr 16 20:29:08.571533 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:08.571121 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls podName:a2a1535c-c8dc-4688-a07f-00a01b4dec34 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:12.571115097 +0000 UTC m=+159.840035485 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls") pod "dns-default-xbszw" (UID: "a2a1535c-c8dc-4688-a07f-00a01b4dec34") : secret "dns-default-metrics-tls" not found Apr 16 20:29:13.685338 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:13.685306 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-22jqt" Apr 16 20:29:42.104958 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:42.104906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:29:42.105455 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:42.105045 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret 
"metrics-daemon-secret" not found Apr 16 20:29:42.105455 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:42.105113 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs podName:30274609-546d-4c7b-abd0-8907fd0a6cd7 nodeName:}" failed. No retries permitted until 2026-04-16 20:31:44.105096725 +0000 UTC m=+251.374017118 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs") pod "network-metrics-daemon-b8p9v" (UID: "30274609-546d-4c7b-abd0-8907fd0a6cd7") : secret "metrics-daemon-secret" not found Apr 16 20:29:50.838355 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.838317 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-cc5b6bc9d-d9pl5"] Apr 16 20:29:50.841034 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.841018 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:50.843422 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.843397 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:29:50.843596 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.843428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 20:29:50.843596 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.843560 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 20:29:50.843709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.843668 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 20:29:50.843787 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.843768 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:29:50.843850 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.843820 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vfctk\"" Apr 16 20:29:50.844507 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.844492 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 20:29:50.852527 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.852506 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-cc5b6bc9d-d9pl5"] Apr 16 20:29:50.943128 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.943059 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss"] Apr 16 20:29:50.945794 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:29:50.945777 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv"] Apr 16 20:29:50.945930 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.945912 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" Apr 16 20:29:50.949147 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.948558 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-xgdxx\"" Apr 16 20:29:50.950328 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.950307 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c"] Apr 16 20:29:50.950498 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.950457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:50.952795 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.952773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 20:29:50.952877 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.952845 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:29:50.952877 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.952775 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:50.952877 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.952868 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 20:29:50.953023 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.952926 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lmgpc\"" Apr 16 20:29:50.955095 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.955077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wn84m\"" Apr 16 20:29:50.956518 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.956500 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 20:29:50.956617 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.956527 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:29:50.956704 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.956685 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:29:50.956844 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.956827 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 20:29:50.957703 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.957683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss"] Apr 16 20:29:50.958864 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.958831 2577 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv"] Apr 16 20:29:50.959882 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.959863 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c"] Apr 16 20:29:50.963004 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.962985 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:50.963101 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.963037 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-stats-auth\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:50.963101 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.963070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:50.963213 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.963100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvzcx\" (UniqueName: \"kubernetes.io/projected/b0125b55-3e0c-4bba-b620-3460a3974959-kube-api-access-bvzcx\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: 
\"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:50.963213 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:50.963170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-default-certificate\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.035480 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.035444 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6"] Apr 16 20:29:51.038331 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.038308 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb"] Apr 16 20:29:51.038479 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.038442 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.040627 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.040609 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-wkdbg\"" Apr 16 20:29:51.040702 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.040637 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 20:29:51.040843 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.040821 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:29:51.041087 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.041071 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 20:29:51.041149 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.041136 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 20:29:51.041657 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.041638 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.043818 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.043799 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 20:29:51.043905 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.043848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 20:29:51.044391 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.044373 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 20:29:51.044533 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.044453 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:29:51.044633 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.044490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8fcfg\"" Apr 16 20:29:51.044820 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.044802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6"] Apr 16 20:29:51.051655 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.051633 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb"] Apr 16 20:29:51.063479 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-default-certificate\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.063613 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063491 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.063613 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.063613 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bc8x\" (UniqueName: \"kubernetes.io/projected/1d3dda55-4aed-4f36-8730-48e51f0a7145-kube-api-access-8bc8x\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:51.063613 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh7w8\" (UniqueName: \"kubernetes.io/projected/da561835-fd55-453b-91fa-23a89f82a5f3-kube-api-access-lh7w8\") pod 
\"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.063613 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-stats-auth\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.063659 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvzcx\" (UniqueName: 
\"kubernetes.io/projected/b0125b55-3e0c-4bba-b620-3460a3974959-kube-api-access-bvzcx\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/da561835-fd55-453b-91fa-23a89f82a5f3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.063784 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:51.563763536 +0000 UTC m=+138.832683951 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : configmap references non-existent config key: service-ca.crt Apr 16 20:29:51.063847 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.063817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnkdl\" (UniqueName: \"kubernetes.io/projected/785129b2-7510-46a7-b033-97db9eb277cd-kube-api-access-rnkdl\") pod \"network-check-source-8894fc9bd-ps9ss\" (UID: \"785129b2-7510-46a7-b033-97db9eb277cd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" Apr 16 20:29:51.064170 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.063870 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:51.563849099 +0000 UTC m=+138.832769493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : secret "router-metrics-certs-default" not found Apr 16 20:29:51.065990 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.065971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-default-certificate\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.066072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.066035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-stats-auth\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.071292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.071275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvzcx\" (UniqueName: \"kubernetes.io/projected/b0125b55-3e0c-4bba-b620-3460a3974959-kube-api-access-bvzcx\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.165084 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.164991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zj2j\" (UniqueName: \"kubernetes.io/projected/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-kube-api-access-8zj2j\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.165084 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:51.165084 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/da561835-fd55-453b-91fa-23a89f82a5f3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.165084 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnkdl\" (UniqueName: \"kubernetes.io/projected/785129b2-7510-46a7-b033-97db9eb277cd-kube-api-access-rnkdl\") pod \"network-check-source-8894fc9bd-ps9ss\" (UID: \"785129b2-7510-46a7-b033-97db9eb277cd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4283e6-7889-48d8-acdf-e35108f466bb-config\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:29:51.165126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpqc\" (UniqueName: \"kubernetes.io/projected/1e4283e6-7889-48d8-acdf-e35108f466bb-kube-api-access-gtpqc\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.165151 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.165227 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls podName:1d3dda55-4aed-4f36-8730-48e51f0a7145 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:29:51.665206211 +0000 UTC m=+138.934126613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-smjfv" (UID: "1d3dda55-4aed-4f36-8730-48e51f0a7145") : secret "samples-operator-tls" not found Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.165286 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bc8x\" (UniqueName: \"kubernetes.io/projected/1d3dda55-4aed-4f36-8730-48e51f0a7145-kube-api-access-8bc8x\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.165341 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls podName:da561835-fd55-453b-91fa-23a89f82a5f3 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:29:51.665328707 +0000 UTC m=+138.934249103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpq9c" (UID: "da561835-fd55-453b-91fa-23a89f82a5f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh7w8\" (UniqueName: \"kubernetes.io/projected/da561835-fd55-453b-91fa-23a89f82a5f3-kube-api-access-lh7w8\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.165399 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4283e6-7889-48d8-acdf-e35108f466bb-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.165897 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.165869 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/da561835-fd55-453b-91fa-23a89f82a5f3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.174171 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.174128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh7w8\" 
(UniqueName: \"kubernetes.io/projected/da561835-fd55-453b-91fa-23a89f82a5f3-kube-api-access-lh7w8\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.174877 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.174860 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnkdl\" (UniqueName: \"kubernetes.io/projected/785129b2-7510-46a7-b033-97db9eb277cd-kube-api-access-rnkdl\") pod \"network-check-source-8894fc9bd-ps9ss\" (UID: \"785129b2-7510-46a7-b033-97db9eb277cd\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" Apr 16 20:29:51.174954 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.174932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bc8x\" (UniqueName: \"kubernetes.io/projected/1d3dda55-4aed-4f36-8730-48e51f0a7145-kube-api-access-8bc8x\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:51.257810 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.257764 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" Apr 16 20:29:51.265807 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.265785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.265885 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.265830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.265973 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.265952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4283e6-7889-48d8-acdf-e35108f466bb-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.266031 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.266007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zj2j\" (UniqueName: \"kubernetes.io/projected/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-kube-api-access-8zj2j\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.266099 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.266087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4283e6-7889-48d8-acdf-e35108f466bb-config\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.266270 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.266206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpqc\" (UniqueName: \"kubernetes.io/projected/1e4283e6-7889-48d8-acdf-e35108f466bb-kube-api-access-gtpqc\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.266403 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.266342 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.266670 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.266646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4283e6-7889-48d8-acdf-e35108f466bb-config\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.268129 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.268096 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4283e6-7889-48d8-acdf-e35108f466bb-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.268220 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.268184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.276553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.276532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpqc\" (UniqueName: \"kubernetes.io/projected/1e4283e6-7889-48d8-acdf-e35108f466bb-kube-api-access-gtpqc\") pod \"service-ca-operator-d6fc45fc5-tdhc6\" (UID: \"1e4283e6-7889-48d8-acdf-e35108f466bb\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.276842 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.276818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zj2j\" (UniqueName: \"kubernetes.io/projected/1a4fa98c-05e4-48fe-93c7-d01bd593d03a-kube-api-access-8zj2j\") pod \"kube-storage-version-migrator-operator-6769c5d45-kqkjb\" (UID: \"1a4fa98c-05e4-48fe-93c7-d01bd593d03a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.348604 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.348578 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" Apr 16 20:29:51.354285 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.354263 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" Apr 16 20:29:51.366681 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.366658 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss"] Apr 16 20:29:51.370350 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:29:51.370320 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785129b2_7510_46a7_b033_97db9eb277cd.slice/crio-0fc7003f929c571f8b014f4e409ef1e95111de193586031ab5aa22f71bf5af56 WatchSource:0}: Error finding container 0fc7003f929c571f8b014f4e409ef1e95111de193586031ab5aa22f71bf5af56: Status 404 returned error can't find the container with id 0fc7003f929c571f8b014f4e409ef1e95111de193586031ab5aa22f71bf5af56 Apr 16 20:29:51.472319 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.472285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6"] Apr 16 20:29:51.475793 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:29:51.475762 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e4283e6_7889_48d8_acdf_e35108f466bb.slice/crio-81ff22c789580b04540f3791de670908572852ce616f5efdba3f792acdc19fb3 WatchSource:0}: Error finding container 81ff22c789580b04540f3791de670908572852ce616f5efdba3f792acdc19fb3: Status 404 returned error can't find the container with id 81ff22c789580b04540f3791de670908572852ce616f5efdba3f792acdc19fb3 Apr 16 20:29:51.495085 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.495059 2577 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb"] Apr 16 20:29:51.499170 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:29:51.499138 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a4fa98c_05e4_48fe_93c7_d01bd593d03a.slice/crio-fa93c964edae5dedf6dbb8fabdeaf900bf02701f2514232b831e67dcd5475305 WatchSource:0}: Error finding container fa93c964edae5dedf6dbb8fabdeaf900bf02701f2514232b831e67dcd5475305: Status 404 returned error can't find the container with id fa93c964edae5dedf6dbb8fabdeaf900bf02701f2514232b831e67dcd5475305 Apr 16 20:29:51.568197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.568168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.568342 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.568213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:51.568400 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.568334 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:29:51.568450 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.568413 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs 
podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:52.568388632 +0000 UTC m=+139.837309024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : secret "router-metrics-certs-default" not found Apr 16 20:29:51.568549 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.568475 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:52.568447284 +0000 UTC m=+139.837367672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : configmap references non-existent config key: service-ca.crt Apr 16 20:29:51.669361 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.669260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:51.669538 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.669378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: 
\"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:51.669538 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.669419 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:51.669538 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.669522 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls podName:da561835-fd55-453b-91fa-23a89f82a5f3 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:52.669499139 +0000 UTC m=+139.938419540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpq9c" (UID: "da561835-fd55-453b-91fa-23a89f82a5f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:51.669672 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.669536 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:29:51.669672 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:51.669585 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls podName:1d3dda55-4aed-4f36-8730-48e51f0a7145 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:52.669569852 +0000 UTC m=+139.938490258 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-smjfv" (UID: "1d3dda55-4aed-4f36-8730-48e51f0a7145") : secret "samples-operator-tls" not found Apr 16 20:29:51.814158 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.814109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" event={"ID":"785129b2-7510-46a7-b033-97db9eb277cd","Type":"ContainerStarted","Data":"f6daed1dde2736838c3efece2d8c03e01713ee580e923510815daa409e313e91"} Apr 16 20:29:51.814158 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.814159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" event={"ID":"785129b2-7510-46a7-b033-97db9eb277cd","Type":"ContainerStarted","Data":"0fc7003f929c571f8b014f4e409ef1e95111de193586031ab5aa22f71bf5af56"} Apr 16 20:29:51.815241 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.815212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" event={"ID":"1e4283e6-7889-48d8-acdf-e35108f466bb","Type":"ContainerStarted","Data":"81ff22c789580b04540f3791de670908572852ce616f5efdba3f792acdc19fb3"} Apr 16 20:29:51.816174 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.816152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" event={"ID":"1a4fa98c-05e4-48fe-93c7-d01bd593d03a","Type":"ContainerStarted","Data":"fa93c964edae5dedf6dbb8fabdeaf900bf02701f2514232b831e67dcd5475305"} Apr 16 20:29:51.830334 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:51.830289 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ps9ss" podStartSLOduration=1.830276485 podStartE2EDuration="1.830276485s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:29:51.829766203 +0000 UTC m=+139.098686615" watchObservedRunningTime="2026-04-16 20:29:51.830276485 +0000 UTC m=+139.099196895" Apr 16 20:29:52.577811 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:52.577758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:52.578416 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:52.577906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:52.578416 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.577966 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:54.57794165 +0000 UTC m=+141.846862044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : configmap references non-existent config key: service-ca.crt Apr 16 20:29:52.578416 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.578028 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:29:52.578416 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.578083 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:54.578066503 +0000 UTC m=+141.846986892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : secret "router-metrics-certs-default" not found Apr 16 20:29:52.679255 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:52.679199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:52.679420 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:52.679333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:52.679420 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.679354 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:52.679582 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.679437 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls podName:da561835-fd55-453b-91fa-23a89f82a5f3 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:54.679413017 +0000 UTC m=+141.948333428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpq9c" (UID: "da561835-fd55-453b-91fa-23a89f82a5f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:52.679582 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.679535 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:29:52.679681 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:52.679598 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls podName:1d3dda55-4aed-4f36-8730-48e51f0a7145 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:54.679581562 +0000 UTC m=+141.948501964 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-smjfv" (UID: "1d3dda55-4aed-4f36-8730-48e51f0a7145") : secret "samples-operator-tls" not found Apr 16 20:29:54.593192 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.593152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:54.593601 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.593206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:54.593601 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.593323 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:29:54.593601 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.593344 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:58.593330528 +0000 UTC m=+145.862250916 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : configmap references non-existent config key: service-ca.crt Apr 16 20:29:54.593601 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.593391 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:58.593373265 +0000 UTC m=+145.862293659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : secret "router-metrics-certs-default" not found Apr 16 20:29:54.694445 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.694347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:54.694631 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.694528 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:29:54.694631 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.694614 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls podName:1d3dda55-4aed-4f36-8730-48e51f0a7145 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:29:58.69458793 +0000 UTC m=+145.963508319 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-smjfv" (UID: "1d3dda55-4aed-4f36-8730-48e51f0a7145") : secret "samples-operator-tls" not found Apr 16 20:29:54.694631 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.694600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:54.694798 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.694774 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:54.694915 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:54.694903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls podName:da561835-fd55-453b-91fa-23a89f82a5f3 nodeName:}" failed. No retries permitted until 2026-04-16 20:29:58.694885805 +0000 UTC m=+145.963806200 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpq9c" (UID: "da561835-fd55-453b-91fa-23a89f82a5f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:54.827064 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.827024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" event={"ID":"1e4283e6-7889-48d8-acdf-e35108f466bb","Type":"ContainerStarted","Data":"573b1d76500b6f4044c4ec14c7d95fd1ba7f5e08efc19e21f4cf83c95fca93b8"} Apr 16 20:29:54.828347 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.828321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" event={"ID":"1a4fa98c-05e4-48fe-93c7-d01bd593d03a","Type":"ContainerStarted","Data":"35414d1a433246b72be9d1f8f079d3d0893d78b4dd98e2bec641ba486bdfa4c6"} Apr 16 20:29:54.840959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.840900 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" podStartSLOduration=0.884089351 podStartE2EDuration="3.840884681s" podCreationTimestamp="2026-04-16 20:29:51 +0000 UTC" firstStartedPulling="2026-04-16 20:29:51.477510822 +0000 UTC m=+138.746431209" lastFinishedPulling="2026-04-16 20:29:54.434306134 +0000 UTC m=+141.703226539" observedRunningTime="2026-04-16 20:29:54.840850402 +0000 UTC m=+142.109770812" watchObservedRunningTime="2026-04-16 20:29:54.840884681 +0000 UTC m=+142.109805093" Apr 16 20:29:54.856517 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:54.856452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" podStartSLOduration=0.935230085 podStartE2EDuration="3.856408787s" podCreationTimestamp="2026-04-16 20:29:51 +0000 UTC" firstStartedPulling="2026-04-16 20:29:51.515643822 +0000 UTC m=+138.784564211" lastFinishedPulling="2026-04-16 20:29:54.436822525 +0000 UTC m=+141.705742913" observedRunningTime="2026-04-16 20:29:54.856349936 +0000 UTC m=+142.125270344" watchObservedRunningTime="2026-04-16 20:29:54.856408787 +0000 UTC m=+142.125329194" Apr 16 20:29:56.862171 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:56.862137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jk4m8_41119468-3774-48bd-98be-d49ab3625162/dns-node-resolver/0.log" Apr 16 20:29:57.861452 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:57.861423 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m848r_8df02eba-eb01-4603-87bd-76a281217485/node-ca/0.log" Apr 16 20:29:58.625033 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:58.625002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:58.625420 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:58.625052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:29:58.625420 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.625139 2577 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:29:58.625420 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.625151 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.625138598 +0000 UTC m=+153.894058986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : configmap references non-existent config key: service-ca.crt Apr 16 20:29:58.625420 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.625190 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.625175871 +0000 UTC m=+153.894096259 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : secret "router-metrics-certs-default" not found Apr 16 20:29:58.725960 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:58.725926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:29:58.726131 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:29:58.726035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:29:58.726131 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.726069 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:29:58.726228 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.726132 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:29:58.726228 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.726135 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls podName:1d3dda55-4aed-4f36-8730-48e51f0a7145 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:30:06.726118079 +0000 UTC m=+153.995038468 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-smjfv" (UID: "1d3dda55-4aed-4f36-8730-48e51f0a7145") : secret "samples-operator-tls" not found Apr 16 20:29:58.726228 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:29:58.726186 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls podName:da561835-fd55-453b-91fa-23a89f82a5f3 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:06.726174216 +0000 UTC m=+153.995094603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpq9c" (UID: "da561835-fd55-453b-91fa-23a89f82a5f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:30:06.686128 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.686093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 20:30:06.686568 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.686147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" Apr 16 
20:30:06.686568 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:06.686269 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:30:06.686568 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:06.686288 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:22.686273605 +0000 UTC m=+169.955193993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : configmap references non-existent config key: service-ca.crt Apr 16 20:30:06.686568 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:06.686327 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs podName:b0125b55-3e0c-4bba-b620-3460a3974959 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:22.686313673 +0000 UTC m=+169.955234062 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs") pod "router-default-cc5b6bc9d-d9pl5" (UID: "b0125b55-3e0c-4bba-b620-3460a3974959") : secret "router-metrics-certs-default" not found Apr 16 20:30:06.786744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.786696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:30:06.786919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.786802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" Apr 16 20:30:06.786983 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:06.786966 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:30:06.787063 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:06.787051 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls podName:da561835-fd55-453b-91fa-23a89f82a5f3 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:22.787029183 +0000 UTC m=+170.055949577 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpq9c" (UID: "da561835-fd55-453b-91fa-23a89f82a5f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:30:06.789081 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.789058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d3dda55-4aed-4f36-8730-48e51f0a7145-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-smjfv\" (UID: \"1d3dda55-4aed-4f36-8730-48e51f0a7145\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:30:06.864626 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.864592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" Apr 16 20:30:06.977094 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:06.977062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv"] Apr 16 20:30:07.667119 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:07.667075 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xbszw" podUID="a2a1535c-c8dc-4688-a07f-00a01b4dec34" Apr 16 20:30:07.685259 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:07.685220 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-skwc8" podUID="f1de9e75-c8b2-4fee-898a-82488ff8d677" Apr 16 20:30:07.858846 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:30:07.858819 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:30:07.859203 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:07.858817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" event={"ID":"1d3dda55-4aed-4f36-8730-48e51f0a7145","Type":"ContainerStarted","Data":"7c0e62ee105f1021224ab32dccb011089a8deda1a9684713df5ba940e8667a89"} Apr 16 20:30:07.859203 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:07.858933 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xbszw" Apr 16 20:30:09.291242 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:09.291202 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-b8p9v" podUID="30274609-546d-4c7b-abd0-8907fd0a6cd7" Apr 16 20:30:09.864572 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:09.864482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" event={"ID":"1d3dda55-4aed-4f36-8730-48e51f0a7145","Type":"ContainerStarted","Data":"8ad94f7a5940e0b46a977b1d9741670a11fd13619929a4782d964971da21c8b3"} Apr 16 20:30:09.864572 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:09.864513 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" event={"ID":"1d3dda55-4aed-4f36-8730-48e51f0a7145","Type":"ContainerStarted","Data":"2717578b44cad4694b112ed655712b00445ee6b54d2585d10ba40a0432de2b6f"} Apr 16 20:30:09.879911 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:09.879864 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-smjfv" podStartSLOduration=18.073600871 podStartE2EDuration="19.879852103s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="2026-04-16 20:30:07.0254835 +0000 UTC m=+154.294403888" lastFinishedPulling="2026-04-16 20:30:08.831734716 +0000 UTC m=+156.100655120" observedRunningTime="2026-04-16 20:30:09.87927196 +0000 UTC m=+157.148192369" watchObservedRunningTime="2026-04-16 20:30:09.879852103 +0000 UTC m=+157.148772543" Apr 16 20:30:12.627895 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.627858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw" Apr 16 20:30:12.627895 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.627911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:30:12.630162 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.630134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2a1535c-c8dc-4688-a07f-00a01b4dec34-metrics-tls\") pod \"dns-default-xbszw\" (UID: \"a2a1535c-c8dc-4688-a07f-00a01b4dec34\") " pod="openshift-dns/dns-default-xbszw" Apr 16 20:30:12.630305 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.630286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1de9e75-c8b2-4fee-898a-82488ff8d677-cert\") pod \"ingress-canary-skwc8\" (UID: \"f1de9e75-c8b2-4fee-898a-82488ff8d677\") " 
pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:30:12.661671 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.661643 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w999j\"" Apr 16 20:30:12.662704 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.662688 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2cgzs\"" Apr 16 20:30:12.669536 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.669512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xbszw" Apr 16 20:30:12.669992 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.669977 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-skwc8" Apr 16 20:30:12.802220 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.802183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xbszw"] Apr 16 20:30:12.806153 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:12.806126 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a1535c_c8dc_4688_a07f_00a01b4dec34.slice/crio-6469d60da7c4959cab548c2e64e7fa1842805eb57d34a3d0d1ab42ce7f29c24f WatchSource:0}: Error finding container 6469d60da7c4959cab548c2e64e7fa1842805eb57d34a3d0d1ab42ce7f29c24f: Status 404 returned error can't find the container with id 6469d60da7c4959cab548c2e64e7fa1842805eb57d34a3d0d1ab42ce7f29c24f Apr 16 20:30:12.816683 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.816660 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-skwc8"] Apr 16 20:30:12.819509 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:12.819483 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1de9e75_c8b2_4fee_898a_82488ff8d677.slice/crio-d7bdb3328aec042b7e5128f59a07cd667fd4bb25803dab418cfa7d2bfca1a2c8 WatchSource:0}: Error finding container d7bdb3328aec042b7e5128f59a07cd667fd4bb25803dab418cfa7d2bfca1a2c8: Status 404 returned error can't find the container with id d7bdb3328aec042b7e5128f59a07cd667fd4bb25803dab418cfa7d2bfca1a2c8
Apr 16 20:30:12.872624 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.872589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-skwc8" event={"ID":"f1de9e75-c8b2-4fee-898a-82488ff8d677","Type":"ContainerStarted","Data":"d7bdb3328aec042b7e5128f59a07cd667fd4bb25803dab418cfa7d2bfca1a2c8"}
Apr 16 20:30:12.873490 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:12.873454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xbszw" event={"ID":"a2a1535c-c8dc-4688-a07f-00a01b4dec34","Type":"ContainerStarted","Data":"6469d60da7c4959cab548c2e64e7fa1842805eb57d34a3d0d1ab42ce7f29c24f"}
Apr 16 20:30:14.881111 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:14.881048 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xbszw" event={"ID":"a2a1535c-c8dc-4688-a07f-00a01b4dec34","Type":"ContainerStarted","Data":"9e7c5bf959f813549bb012f279f7837ef8e249df7ad9623d41dd632172dc028c"}
Apr 16 20:30:14.882796 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:14.882755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-skwc8" event={"ID":"f1de9e75-c8b2-4fee-898a-82488ff8d677","Type":"ContainerStarted","Data":"373d01056fea035035afb252687eae74578cb9128315a4b8b781f2ab3f626b5d"}
Apr 16 20:30:14.897584 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:14.897542 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-skwc8" podStartSLOduration=129.028385352 podStartE2EDuration="2m10.897528787s" podCreationTimestamp="2026-04-16 20:28:04 +0000 UTC" firstStartedPulling="2026-04-16 20:30:12.821417104 +0000 UTC m=+160.090337512" lastFinishedPulling="2026-04-16 20:30:14.690560559 +0000 UTC m=+161.959480947" observedRunningTime="2026-04-16 20:30:14.896687894 +0000 UTC m=+162.165608317" watchObservedRunningTime="2026-04-16 20:30:14.897528787 +0000 UTC m=+162.166449196"
Apr 16 20:30:15.887264 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:15.887188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xbszw" event={"ID":"a2a1535c-c8dc-4688-a07f-00a01b4dec34","Type":"ContainerStarted","Data":"a09959f4fb7c564a2cb6e4cdf684fc52c6a210b49e386d45ab663f0df750d87b"}
Apr 16 20:30:15.903899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:15.903850 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xbszw" podStartSLOduration=130.024565955 podStartE2EDuration="2m11.903835912s" podCreationTimestamp="2026-04-16 20:28:04 +0000 UTC" firstStartedPulling="2026-04-16 20:30:12.808049856 +0000 UTC m=+160.076970245" lastFinishedPulling="2026-04-16 20:30:14.687319802 +0000 UTC m=+161.956240202" observedRunningTime="2026-04-16 20:30:15.902719775 +0000 UTC m=+163.171640185" watchObservedRunningTime="2026-04-16 20:30:15.903835912 +0000 UTC m=+163.172756321"
Apr 16 20:30:16.890046 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:16.890007 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xbszw"
Apr 16 20:30:18.283760 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.283725 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rrklb"]
Apr 16 20:30:18.289277 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.289247 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.291644 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.291621 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:30:18.291743 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.291624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:30:18.292760 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.292736 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:30:18.292855 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.292761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-78bvs\""
Apr 16 20:30:18.292855 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.292794 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:30:18.298386 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.298345 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rrklb"]
Apr 16 20:30:18.320570 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.320543 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c9d75854d-g48xr"]
Apr 16 20:30:18.323409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.323392 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.325870 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.325848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 20:30:18.325983 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.325894 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tj6tv\""
Apr 16 20:30:18.325983 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.325959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 20:30:18.326169 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.326154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 20:30:18.330876 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.330806 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 20:30:18.334954 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.334935 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c9d75854d-g48xr"]
Apr 16 20:30:18.371530 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bab34410-b402-43af-930c-05dbd9430ae8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.371708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9h5f\" (UniqueName: \"kubernetes.io/projected/bab34410-b402-43af-930c-05dbd9430ae8-kube-api-access-n9h5f\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.371708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bab34410-b402-43af-930c-05dbd9430ae8-crio-socket\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.371708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-image-registry-private-configuration\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.371708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-ca-trust-extracted\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.371883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-installation-pull-secrets\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.371883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-bound-sa-token\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.371883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-registry-certificates\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.371883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-trusted-ca\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.371883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bab34410-b402-43af-930c-05dbd9430ae8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.371883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4262\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-kube-api-access-h4262\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.372076 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bab34410-b402-43af-930c-05dbd9430ae8-data-volume\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.372076 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.371913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-registry-tls\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472248 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bab34410-b402-43af-930c-05dbd9430ae8-data-volume\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.472248 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-registry-tls\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472506 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bab34410-b402-43af-930c-05dbd9430ae8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.472506 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9h5f\" (UniqueName: \"kubernetes.io/projected/bab34410-b402-43af-930c-05dbd9430ae8-kube-api-access-n9h5f\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.472506 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bab34410-b402-43af-930c-05dbd9430ae8-crio-socket\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.472506 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-image-registry-private-configuration\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472506 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-ca-trust-extracted\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472506 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bab34410-b402-43af-930c-05dbd9430ae8-crio-socket\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.472822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-installation-pull-secrets\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-bound-sa-token\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-registry-certificates\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.472822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-trusted-ca\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.473043 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bab34410-b402-43af-930c-05dbd9430ae8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.473043 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.472903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4262\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-kube-api-access-h4262\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.473307 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.473284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-ca-trust-extracted\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.473387 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.473366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bab34410-b402-43af-930c-05dbd9430ae8-data-volume\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.473447 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.473398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bab34410-b402-43af-930c-05dbd9430ae8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.474310 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.474282 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-trusted-ca\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.474629 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.474602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-registry-certificates\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.475126 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.475099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bab34410-b402-43af-930c-05dbd9430ae8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.475197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.475125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-installation-pull-secrets\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.475197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.475175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-registry-tls\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.475197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.475180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-image-registry-private-configuration\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.480453 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.480425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9h5f\" (UniqueName: \"kubernetes.io/projected/bab34410-b402-43af-930c-05dbd9430ae8-kube-api-access-n9h5f\") pod \"insights-runtime-extractor-rrklb\" (UID: \"bab34410-b402-43af-930c-05dbd9430ae8\") " pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.480648 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.480629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-bound-sa-token\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.480915 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.480893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4262\" (UniqueName: \"kubernetes.io/projected/8d5a6299-5ec2-4b7a-9105-5e52be9dc830-kube-api-access-h4262\") pod \"image-registry-7c9d75854d-g48xr\" (UID: \"8d5a6299-5ec2-4b7a-9105-5e52be9dc830\") " pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.599105 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.599031 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rrklb"
Apr 16 20:30:18.633793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.633763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.734866 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.734814 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rrklb"]
Apr 16 20:30:18.739555 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:18.739522 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab34410_b402_43af_930c_05dbd9430ae8.slice/crio-3b4e861b5664873a81d31cdeb619616efc0e31f95de290a619706a60a5ac2e5f WatchSource:0}: Error finding container 3b4e861b5664873a81d31cdeb619616efc0e31f95de290a619706a60a5ac2e5f: Status 404 returned error can't find the container with id 3b4e861b5664873a81d31cdeb619616efc0e31f95de290a619706a60a5ac2e5f
Apr 16 20:30:18.784798 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.784773 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c9d75854d-g48xr"]
Apr 16 20:30:18.786378 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:18.786351 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5a6299_5ec2_4b7a_9105_5e52be9dc830.slice/crio-521a4a70d34b48f053acd46101cf5a20aaeeaeb2124e3928aa4f2f7866ee0fb2 WatchSource:0}: Error finding container 521a4a70d34b48f053acd46101cf5a20aaeeaeb2124e3928aa4f2f7866ee0fb2: Status 404 returned error can't find the container with id 521a4a70d34b48f053acd46101cf5a20aaeeaeb2124e3928aa4f2f7866ee0fb2
Apr 16 20:30:18.896557 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.896446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr" event={"ID":"8d5a6299-5ec2-4b7a-9105-5e52be9dc830","Type":"ContainerStarted","Data":"a47e85da3a946aada5c77c609e3e4cfe41e53853683d69fe1b0fc1c4e4c6db32"}
Apr 16 20:30:18.896557 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.896504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr" event={"ID":"8d5a6299-5ec2-4b7a-9105-5e52be9dc830","Type":"ContainerStarted","Data":"521a4a70d34b48f053acd46101cf5a20aaeeaeb2124e3928aa4f2f7866ee0fb2"}
Apr 16 20:30:18.896777 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.896591 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:18.897752 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.897726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrklb" event={"ID":"bab34410-b402-43af-930c-05dbd9430ae8","Type":"ContainerStarted","Data":"859726b1d67d9bb5b592cd5854ebfaab5edf2ae17c5923fa2eddd0ad35486071"}
Apr 16 20:30:18.897752 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.897754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrklb" event={"ID":"bab34410-b402-43af-930c-05dbd9430ae8","Type":"ContainerStarted","Data":"3b4e861b5664873a81d31cdeb619616efc0e31f95de290a619706a60a5ac2e5f"}
Apr 16 20:30:18.913744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:18.913693 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr" podStartSLOduration=0.913676375 podStartE2EDuration="913.676375ms" podCreationTimestamp="2026-04-16 20:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:18.913170768 +0000 UTC m=+166.182091178" watchObservedRunningTime="2026-04-16 20:30:18.913676375 +0000 UTC m=+166.182596787"
Apr 16 20:30:19.902522 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:19.902415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrklb" event={"ID":"bab34410-b402-43af-930c-05dbd9430ae8","Type":"ContainerStarted","Data":"cfc539fcc775f2d281a7e41639d761b2c82e58b1170d62f410d6f9cedd8b10f2"}
Apr 16 20:30:21.909264 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:21.909221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrklb" event={"ID":"bab34410-b402-43af-930c-05dbd9430ae8","Type":"ContainerStarted","Data":"805d69724707cb96d5d0eddf03d29b9db1c49b454b492def484f027aeac2c7a2"}
Apr 16 20:30:21.926433 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:21.926383 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rrklb" podStartSLOduration=1.760423287 podStartE2EDuration="3.926370297s" podCreationTimestamp="2026-04-16 20:30:18 +0000 UTC" firstStartedPulling="2026-04-16 20:30:18.796415275 +0000 UTC m=+166.065335663" lastFinishedPulling="2026-04-16 20:30:20.962362281 +0000 UTC m=+168.231282673" observedRunningTime="2026-04-16 20:30:21.924901746 +0000 UTC m=+169.193822156" watchObservedRunningTime="2026-04-16 20:30:21.926370297 +0000 UTC m=+169.195290707"
Apr 16 20:30:22.704922 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.704876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:22.705087 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.704964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:22.705615 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.705593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0125b55-3e0c-4bba-b620-3460a3974959-service-ca-bundle\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:22.707246 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.707219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0125b55-3e0c-4bba-b620-3460a3974959-metrics-certs\") pod \"router-default-cc5b6bc9d-d9pl5\" (UID: \"b0125b55-3e0c-4bba-b620-3460a3974959\") " pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:22.805626 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.805584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c"
Apr 16 20:30:22.807980 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.807956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/da561835-fd55-453b-91fa-23a89f82a5f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpq9c\" (UID: \"da561835-fd55-453b-91fa-23a89f82a5f3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c"
Apr 16 20:30:22.949614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:22.949571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:23.069639 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.069603 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c"
Apr 16 20:30:23.070108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.070081 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-cc5b6bc9d-d9pl5"]
Apr 16 20:30:23.072809 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:23.072783 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0125b55_3e0c_4bba_b620_3460a3974959.slice/crio-6ad9efc9bc32811af5192a4d1666868d6c060a14ea2198fd09c7e3303eb879a7 WatchSource:0}: Error finding container 6ad9efc9bc32811af5192a4d1666868d6c060a14ea2198fd09c7e3303eb879a7: Status 404 returned error can't find the container with id 6ad9efc9bc32811af5192a4d1666868d6c060a14ea2198fd09c7e3303eb879a7
Apr 16 20:30:23.193572 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.193543 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c"]
Apr 16 20:30:23.196827 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:23.196803 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda561835_fd55_453b_91fa_23a89f82a5f3.slice/crio-22a94137c1d49569493024c92e073f50678bc3756f876272bdb420dc260dabdb WatchSource:0}: Error finding container 22a94137c1d49569493024c92e073f50678bc3756f876272bdb420dc260dabdb: Status 404 returned error can't find the container with id 22a94137c1d49569493024c92e073f50678bc3756f876272bdb420dc260dabdb
Apr 16 20:30:23.915962 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.915922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" event={"ID":"da561835-fd55-453b-91fa-23a89f82a5f3","Type":"ContainerStarted","Data":"22a94137c1d49569493024c92e073f50678bc3756f876272bdb420dc260dabdb"}
Apr 16 20:30:23.917077 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.917052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" event={"ID":"b0125b55-3e0c-4bba-b620-3460a3974959","Type":"ContainerStarted","Data":"f207e98bd4066a075b1d919185a87530c7db53abea3bde3c4bf1c82f4e594ec4"}
Apr 16 20:30:23.917213 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.917084 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" event={"ID":"b0125b55-3e0c-4bba-b620-3460a3974959","Type":"ContainerStarted","Data":"6ad9efc9bc32811af5192a4d1666868d6c060a14ea2198fd09c7e3303eb879a7"}
Apr 16 20:30:23.934694 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.934655 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5" podStartSLOduration=33.934641951 podStartE2EDuration="33.934641951s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:23.933874392 +0000 UTC m=+171.202794802" watchObservedRunningTime="2026-04-16 20:30:23.934641951 +0000 UTC m=+171.203562376"
Apr 16 20:30:23.950498 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.950454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:23.952926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:23.952907 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:24.279906 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:24.279826 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v"
Apr 16 20:30:24.920288 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:24.920253 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:24.921676 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:24.921653 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-cc5b6bc9d-d9pl5"
Apr 16 20:30:25.672691 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.672662 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp"]
Apr 16 20:30:25.675710 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.675694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp"
Apr 16 20:30:25.678026 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.677999 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 20:30:25.678142 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.678104 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2srs2\""
Apr 16 20:30:25.682599 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.682559 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp"]
Apr 16 20:30:25.727053 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.727019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/28ba2caa-20e5-4406-8427-83e5469c3854-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zkfvp\" (UID: \"28ba2caa-20e5-4406-8427-83e5469c3854\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp"
Apr 16 20:30:25.827535 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.827495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/28ba2caa-20e5-4406-8427-83e5469c3854-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zkfvp\" (UID: \"28ba2caa-20e5-4406-8427-83e5469c3854\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp"
Apr 16 20:30:25.827669 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:25.827626 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 20:30:25.827734 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:25.827695 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28ba2caa-20e5-4406-8427-83e5469c3854-tls-certificates podName:28ba2caa-20e5-4406-8427-83e5469c3854 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:26.327676847 +0000 UTC m=+173.596597250 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/28ba2caa-20e5-4406-8427-83e5469c3854-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-zkfvp" (UID: "28ba2caa-20e5-4406-8427-83e5469c3854") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 20:30:25.924949 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.924861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" event={"ID":"da561835-fd55-453b-91fa-23a89f82a5f3","Type":"ContainerStarted","Data":"05efeae71b347c2600b4ec5f80e03d439b99b48b462cdf92c5e95cefa4d42929"} Apr 16 20:30:25.939991 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:25.939942 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpq9c" podStartSLOduration=33.948696469 podStartE2EDuration="35.93993017s" podCreationTimestamp="2026-04-16 20:29:50 +0000 UTC" firstStartedPulling="2026-04-16 20:30:23.198737277 +0000 UTC m=+170.467657665" lastFinishedPulling="2026-04-16 20:30:25.189970974 +0000 UTC m=+172.458891366" observedRunningTime="2026-04-16 20:30:25.938938161 +0000 UTC m=+173.207858570" watchObservedRunningTime="2026-04-16 20:30:25.93993017 +0000 UTC m=+173.208850614" Apr 16 20:30:26.331758 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:26.331662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/28ba2caa-20e5-4406-8427-83e5469c3854-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zkfvp\" (UID: \"28ba2caa-20e5-4406-8427-83e5469c3854\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" Apr 16 20:30:26.333942 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:26.333910 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" 
(UniqueName: \"kubernetes.io/secret/28ba2caa-20e5-4406-8427-83e5469c3854-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zkfvp\" (UID: \"28ba2caa-20e5-4406-8427-83e5469c3854\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" Apr 16 20:30:26.585627 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:26.585540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" Apr 16 20:30:26.703315 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:26.703285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp"] Apr 16 20:30:26.706157 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:26.706122 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ba2caa_20e5_4406_8427_83e5469c3854.slice/crio-fa87bfe37b67c4277eae47ad88a6ecf047efa21192a327e99da31b60d00de64d WatchSource:0}: Error finding container fa87bfe37b67c4277eae47ad88a6ecf047efa21192a327e99da31b60d00de64d: Status 404 returned error can't find the container with id fa87bfe37b67c4277eae47ad88a6ecf047efa21192a327e99da31b60d00de64d Apr 16 20:30:26.895121 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:26.895038 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xbszw" Apr 16 20:30:26.929341 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:26.929246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" event={"ID":"28ba2caa-20e5-4406-8427-83e5469c3854","Type":"ContainerStarted","Data":"fa87bfe37b67c4277eae47ad88a6ecf047efa21192a327e99da31b60d00de64d"} Apr 16 20:30:27.933366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:27.933329 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" event={"ID":"28ba2caa-20e5-4406-8427-83e5469c3854","Type":"ContainerStarted","Data":"43feb1902649c7ab5831c28fadf3d0a5827288c0cb78ca755a61ff96813e1be6"} Apr 16 20:30:27.933957 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:27.933565 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" Apr 16 20:30:27.938047 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:27.938023 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" Apr 16 20:30:27.948458 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:27.948419 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zkfvp" podStartSLOduration=1.8248647550000001 podStartE2EDuration="2.94840982s" podCreationTimestamp="2026-04-16 20:30:25 +0000 UTC" firstStartedPulling="2026-04-16 20:30:26.708164612 +0000 UTC m=+173.977085016" lastFinishedPulling="2026-04-16 20:30:27.831709694 +0000 UTC m=+175.100630081" observedRunningTime="2026-04-16 20:30:27.946892307 +0000 UTC m=+175.215812817" watchObservedRunningTime="2026-04-16 20:30:27.94840982 +0000 UTC m=+175.217330230" Apr 16 20:30:28.735018 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.734989 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-lpkm8"] Apr 16 20:30:28.738246 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.738227 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.740991 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.740971 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 20:30:28.741951 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.741933 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 20:30:28.742042 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.741952 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-s6c2g\"" Apr 16 20:30:28.742042 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.742035 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:30:28.749546 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.749525 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-lpkm8"] Apr 16 20:30:28.852670 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.852633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63d1762d-5ddc-41c4-a3d3-575e6308e36e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.852863 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.852678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63d1762d-5ddc-41c4-a3d3-575e6308e36e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: 
\"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.852863 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.852740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf9kh\" (UniqueName: \"kubernetes.io/projected/63d1762d-5ddc-41c4-a3d3-575e6308e36e-kube-api-access-mf9kh\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.852863 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.852807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63d1762d-5ddc-41c4-a3d3-575e6308e36e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.953415 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.953382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63d1762d-5ddc-41c4-a3d3-575e6308e36e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.953415 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.953416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63d1762d-5ddc-41c4-a3d3-575e6308e36e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.953966 
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.953436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf9kh\" (UniqueName: \"kubernetes.io/projected/63d1762d-5ddc-41c4-a3d3-575e6308e36e-kube-api-access-mf9kh\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.953966 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.953578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63d1762d-5ddc-41c4-a3d3-575e6308e36e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.954197 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.954175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63d1762d-5ddc-41c4-a3d3-575e6308e36e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.955811 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.955793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63d1762d-5ddc-41c4-a3d3-575e6308e36e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.956408 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.956391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/63d1762d-5ddc-41c4-a3d3-575e6308e36e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:28.961567 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:28.961540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf9kh\" (UniqueName: \"kubernetes.io/projected/63d1762d-5ddc-41c4-a3d3-575e6308e36e-kube-api-access-mf9kh\") pod \"prometheus-operator-5676c8c784-lpkm8\" (UID: \"63d1762d-5ddc-41c4-a3d3-575e6308e36e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:29.047200 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:29.047128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" Apr 16 20:30:29.165138 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:29.165108 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-lpkm8"] Apr 16 20:30:29.169052 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:29.169024 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d1762d_5ddc_41c4_a3d3_575e6308e36e.slice/crio-699dfaf67099ac739fe15a888d25ae813daa7fa008f4116d469fd824feea0b24 WatchSource:0}: Error finding container 699dfaf67099ac739fe15a888d25ae813daa7fa008f4116d469fd824feea0b24: Status 404 returned error can't find the container with id 699dfaf67099ac739fe15a888d25ae813daa7fa008f4116d469fd824feea0b24 Apr 16 20:30:29.939403 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:29.939369 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" 
event={"ID":"63d1762d-5ddc-41c4-a3d3-575e6308e36e","Type":"ContainerStarted","Data":"699dfaf67099ac739fe15a888d25ae813daa7fa008f4116d469fd824feea0b24"} Apr 16 20:30:30.944569 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:30.944520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" event={"ID":"63d1762d-5ddc-41c4-a3d3-575e6308e36e","Type":"ContainerStarted","Data":"185cccac23c0763aa06b97e86e4544ebfa20f85bb48585926b537c087a378a3b"} Apr 16 20:30:30.944569 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:30.944561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" event={"ID":"63d1762d-5ddc-41c4-a3d3-575e6308e36e","Type":"ContainerStarted","Data":"e8634275767f9781d38ef0e885cb57d51a72163d2d644a05ff56c5256a6de6ec"} Apr 16 20:30:30.962844 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:30.962779 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-lpkm8" podStartSLOduration=1.377011617 podStartE2EDuration="2.962757561s" podCreationTimestamp="2026-04-16 20:30:28 +0000 UTC" firstStartedPulling="2026-04-16 20:30:29.170851633 +0000 UTC m=+176.439772021" lastFinishedPulling="2026-04-16 20:30:30.756597577 +0000 UTC m=+178.025517965" observedRunningTime="2026-04-16 20:30:30.96055263 +0000 UTC m=+178.229473038" watchObservedRunningTime="2026-04-16 20:30:30.962757561 +0000 UTC m=+178.231677974" Apr 16 20:30:33.058365 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.058331 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"] Apr 16 20:30:33.064706 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.064668 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.067367 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.067344 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 20:30:33.067576 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.067557 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5ph5d\"" Apr 16 20:30:33.067667 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.067654 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 20:30:33.076565 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.076539 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"] Apr 16 20:30:33.101889 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.101854 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-98fcz"] Apr 16 20:30:33.105357 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.105303 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.109805 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.109785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:30:33.109983 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.109963 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8h2bq\"" Apr 16 20:30:33.110113 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.110025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:30:33.110113 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.110049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:30:33.190687 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190650 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.190687 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9twc\" (UniqueName: \"kubernetes.io/projected/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-kube-api-access-t9twc\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:30:33.190739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-textfile\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-wtmp\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-tls\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bb3753c6-0017-4c11-a7e8-751eda08c472-metrics-client-ca\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-accelerators-collector-config\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.190919 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-root\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.191176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.190992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.191176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.191031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2r6\" (UniqueName: \"kubernetes.io/projected/bb3753c6-0017-4c11-a7e8-751eda08c472-kube-api-access-gh2r6\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 
20:30:33.191176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.191058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-sys\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.191176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.191100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.292410 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.292410 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9twc\" (UniqueName: \"kubernetes.io/projected/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-kube-api-access-t9twc\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.292661 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-textfile\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.292661 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" Apr 16 20:30:33.292661 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-wtmp\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.292814 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-tls\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.292814 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3753c6-0017-4c11-a7e8-751eda08c472-metrics-client-ca\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz" Apr 16 20:30:33.292814 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:30:33.292726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-accelerators-collector-config\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.292814 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-root\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.292814 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-textfile\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-root\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2r6\" (UniqueName: \"kubernetes.io/projected/bb3753c6-0017-4c11-a7e8-751eda08c472-kube-api-access-gh2r6\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-sys\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.292983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:33.293072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.293047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-sys\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293258 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.293214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb3753c6-0017-4c11-a7e8-751eda08c472-metrics-client-ca\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.293328 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.293314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-wtmp\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.294116 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.294099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:33.295293 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.295278 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 20:30:33.295365 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.295347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:30:33.295642 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.295628 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:30:33.295679 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.295650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 20:30:33.295736 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.295723 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:30:33.302724 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:33.302696 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 20:30:33.302837 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:33.302770 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-tls podName:14437e71-f1f0-4889-bc9e-6d4684ff0b5c nodeName:}" failed. No retries permitted until 2026-04-16 20:30:33.802749437 +0000 UTC m=+181.071669838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vqrb4" (UID: "14437e71-f1f0-4889-bc9e-6d4684ff0b5c") : secret "openshift-state-metrics-tls" not found
Apr 16 20:30:33.302911 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:33.302872 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 20:30:33.302971 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:30:33.302933 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-tls podName:bb3753c6-0017-4c11-a7e8-751eda08c472 nodeName:}" failed. No retries permitted until 2026-04-16 20:30:33.802911234 +0000 UTC m=+181.071831635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-tls") pod "node-exporter-98fcz" (UID: "bb3753c6-0017-4c11-a7e8-751eda08c472") : secret "node-exporter-tls" not found
Apr 16 20:30:33.303145 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.303127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2r6\" (UniqueName: \"kubernetes.io/projected/bb3753c6-0017-4c11-a7e8-751eda08c472-kube-api-access-gh2r6\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.303322 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.303296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9twc\" (UniqueName: \"kubernetes.io/projected/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-kube-api-access-t9twc\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:33.304001 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.303973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-accelerators-collector-config\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.305195 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.305170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:33.305585 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.305570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.896912 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.896876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:33.897122 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.896928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-tls\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.899308 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.899278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bb3753c6-0017-4c11-a7e8-751eda08c472-node-exporter-tls\") pod \"node-exporter-98fcz\" (UID: \"bb3753c6-0017-4c11-a7e8-751eda08c472\") " pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:33.899448 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.899329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14437e71-f1f0-4889-bc9e-6d4684ff0b5c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vqrb4\" (UID: \"14437e71-f1f0-4889-bc9e-6d4684ff0b5c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:33.976167 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.976133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5ph5d\""
Apr 16 20:30:33.985045 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:33.985021 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"
Apr 16 20:30:34.016965 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.016935 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8h2bq\""
Apr 16 20:30:34.025825 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.024807 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-98fcz"
Apr 16 20:30:34.035630 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:34.035596 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb3753c6_0017_4c11_a7e8_751eda08c472.slice/crio-54ace11d2b24edaed70b033ffd2948a766f998746cfe0678506e62646992b813 WatchSource:0}: Error finding container 54ace11d2b24edaed70b033ffd2948a766f998746cfe0678506e62646992b813: Status 404 returned error can't find the container with id 54ace11d2b24edaed70b033ffd2948a766f998746cfe0678506e62646992b813
Apr 16 20:30:34.123142 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.123111 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4"]
Apr 16 20:30:34.126233 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:34.126198 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14437e71_f1f0_4889_bc9e_6d4684ff0b5c.slice/crio-34bba09152d68908bef9cfab7f72489de0e4476bce0887db584bef1a2a1d2a8d WatchSource:0}: Error finding container 34bba09152d68908bef9cfab7f72489de0e4476bce0887db584bef1a2a1d2a8d: Status 404 returned error can't find the container with id 34bba09152d68908bef9cfab7f72489de0e4476bce0887db584bef1a2a1d2a8d
Apr 16 20:30:34.955361 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.955330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" event={"ID":"14437e71-f1f0-4889-bc9e-6d4684ff0b5c","Type":"ContainerStarted","Data":"7d459411492af34ace4209a1ef9c967151946d7a407d948302bc15526e97e3b1"}
Apr 16 20:30:34.955517 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.955374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" event={"ID":"14437e71-f1f0-4889-bc9e-6d4684ff0b5c","Type":"ContainerStarted","Data":"8b42b17509a3362835bb4932b5145eb326ed669c57747a465bbacbd88ce758df"}
Apr 16 20:30:34.955517 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.955390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" event={"ID":"14437e71-f1f0-4889-bc9e-6d4684ff0b5c","Type":"ContainerStarted","Data":"34bba09152d68908bef9cfab7f72489de0e4476bce0887db584bef1a2a1d2a8d"}
Apr 16 20:30:34.956538 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:34.956512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98fcz" event={"ID":"bb3753c6-0017-4c11-a7e8-751eda08c472","Type":"ContainerStarted","Data":"54ace11d2b24edaed70b033ffd2948a766f998746cfe0678506e62646992b813"}
Apr 16 20:30:35.175042 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.175008 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-6pptv"]
Apr 16 20:30:35.178354 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.178331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6pptv"
Apr 16 20:30:35.181326 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.181302 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 20:30:35.181437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.181366 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-66ztm\""
Apr 16 20:30:35.181437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.181393 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 20:30:35.185271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.185234 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6pptv"]
Apr 16 20:30:35.310708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.310673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pf4j\" (UniqueName: \"kubernetes.io/projected/b420aa5e-b5f4-4c5a-963e-ff55196e4ca9-kube-api-access-4pf4j\") pod \"downloads-6bcc868b7-6pptv\" (UID: \"b420aa5e-b5f4-4c5a-963e-ff55196e4ca9\") " pod="openshift-console/downloads-6bcc868b7-6pptv"
Apr 16 20:30:35.411744 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.411709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pf4j\" (UniqueName: \"kubernetes.io/projected/b420aa5e-b5f4-4c5a-963e-ff55196e4ca9-kube-api-access-4pf4j\") pod \"downloads-6bcc868b7-6pptv\" (UID: \"b420aa5e-b5f4-4c5a-963e-ff55196e4ca9\") " pod="openshift-console/downloads-6bcc868b7-6pptv"
Apr 16 20:30:35.419821 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.419793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pf4j\" (UniqueName: \"kubernetes.io/projected/b420aa5e-b5f4-4c5a-963e-ff55196e4ca9-kube-api-access-4pf4j\") pod \"downloads-6bcc868b7-6pptv\" (UID: \"b420aa5e-b5f4-4c5a-963e-ff55196e4ca9\") " pod="openshift-console/downloads-6bcc868b7-6pptv"
Apr 16 20:30:35.489762 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.489673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6pptv"
Apr 16 20:30:35.631192 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.631163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6pptv"]
Apr 16 20:30:35.780960 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:35.780887 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb420aa5e_b5f4_4c5a_963e_ff55196e4ca9.slice/crio-76e909334832f543395f045cfcd6a6ecb940042d61ffbfd5a8ef1cbc6270c6ae WatchSource:0}: Error finding container 76e909334832f543395f045cfcd6a6ecb940042d61ffbfd5a8ef1cbc6270c6ae: Status 404 returned error can't find the container with id 76e909334832f543395f045cfcd6a6ecb940042d61ffbfd5a8ef1cbc6270c6ae
Apr 16 20:30:35.960899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.960863 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb3753c6-0017-4c11-a7e8-751eda08c472" containerID="ec33ebff4fecde5c3422679a1430c2d8bf5066ce0d13f4b97362c2d6a4c1037f" exitCode=0
Apr 16 20:30:35.961074 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.960946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98fcz" event={"ID":"bb3753c6-0017-4c11-a7e8-751eda08c472","Type":"ContainerDied","Data":"ec33ebff4fecde5c3422679a1430c2d8bf5066ce0d13f4b97362c2d6a4c1037f"}
Apr 16 20:30:35.962970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.962951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" event={"ID":"14437e71-f1f0-4889-bc9e-6d4684ff0b5c","Type":"ContainerStarted","Data":"f1b2ce9dc99b2015c622241551df4c05516892a1a791c782fc99b5f2b7de717e"}
Apr 16 20:30:35.964033 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:35.964013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6pptv" event={"ID":"b420aa5e-b5f4-4c5a-963e-ff55196e4ca9","Type":"ContainerStarted","Data":"76e909334832f543395f045cfcd6a6ecb940042d61ffbfd5a8ef1cbc6270c6ae"}
Apr 16 20:30:36.008785 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:36.008741 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vqrb4" podStartSLOduration=1.445470848 podStartE2EDuration="3.008725625s" podCreationTimestamp="2026-04-16 20:30:33 +0000 UTC" firstStartedPulling="2026-04-16 20:30:34.270789429 +0000 UTC m=+181.539709823" lastFinishedPulling="2026-04-16 20:30:35.834044209 +0000 UTC m=+183.102964600" observedRunningTime="2026-04-16 20:30:36.007387998 +0000 UTC m=+183.276308409" watchObservedRunningTime="2026-04-16 20:30:36.008725625 +0000 UTC m=+183.277646035"
Apr 16 20:30:36.968931 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:36.968891 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98fcz" event={"ID":"bb3753c6-0017-4c11-a7e8-751eda08c472","Type":"ContainerStarted","Data":"d6f08dbd9920f017c5e8ef47e23b096a2184c9bddd737f63a49cda33a5e7e376"}
Apr 16 20:30:36.968931 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:36.968936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98fcz" event={"ID":"bb3753c6-0017-4c11-a7e8-751eda08c472","Type":"ContainerStarted","Data":"1e7387261314b8f8121d34b37df8a2c373c83acf7ae895903c1e4bbf2ff2da9a"}
Apr 16 20:30:36.989760 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:36.989703 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-98fcz" podStartSLOduration=3.130702862 podStartE2EDuration="3.98968372s" podCreationTimestamp="2026-04-16 20:30:33 +0000 UTC" firstStartedPulling="2026-04-16 20:30:34.037981035 +0000 UTC m=+181.306901468" lastFinishedPulling="2026-04-16 20:30:34.896961935 +0000 UTC m=+182.165882326" observedRunningTime="2026-04-16 20:30:36.989410212 +0000 UTC m=+184.258330630" watchObservedRunningTime="2026-04-16 20:30:36.98968372 +0000 UTC m=+184.258604126"
Apr 16 20:30:39.109833 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.108959 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7595676dfd-gm8pm"]
Apr 16 20:30:39.112824 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.112792 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.115486 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.115346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 20:30:39.117003 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.116936 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j872q\""
Apr 16 20:30:39.117003 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.116971 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 20:30:39.117191 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.117021 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 20:30:39.117191 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.116971 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 20:30:39.117191 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.116941 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 20:30:39.121135 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.121100 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7595676dfd-gm8pm"]
Apr 16 20:30:39.247119 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.247089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-oauth-config\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.247321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.247157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-console-config\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.247321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.247207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqszv\" (UniqueName: \"kubernetes.io/projected/c0160a02-d88b-428c-a5cf-980839520cbc-kube-api-access-xqszv\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.247321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.247239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-oauth-serving-cert\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.247321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.247262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-service-ca\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.247321 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.247315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-serving-cert\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.348720 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.348680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-serving-cert\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.348890 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.348835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-oauth-config\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.348951 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.348895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-console-config\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.348951 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.348926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqszv\" (UniqueName: \"kubernetes.io/projected/c0160a02-d88b-428c-a5cf-980839520cbc-kube-api-access-xqszv\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.349066 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.348957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-oauth-serving-cert\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.349066 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.348988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-service-ca\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.349773 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.349742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-service-ca\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.349913 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.349781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-oauth-serving-cert\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.349913 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.349849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-console-config\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.353496 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.353450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-oauth-config\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.353684 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.353601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-serving-cert\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.356686 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.356662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqszv\" (UniqueName: \"kubernetes.io/projected/c0160a02-d88b-428c-a5cf-980839520cbc-kube-api-access-xqszv\") pod \"console-7595676dfd-gm8pm\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.427306 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.427266 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7595676dfd-gm8pm"
Apr 16 20:30:39.573824 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.573787 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7595676dfd-gm8pm"]
Apr 16 20:30:39.577667 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:39.577635 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0160a02_d88b_428c_a5cf_980839520cbc.slice/crio-e9a8e83735ff93683ad3b567e515c798096a1f19b7c3ab8e252d7ca6baef32d4 WatchSource:0}: Error finding container e9a8e83735ff93683ad3b567e515c798096a1f19b7c3ab8e252d7ca6baef32d4: Status 404 returned error can't find the container with id e9a8e83735ff93683ad3b567e515c798096a1f19b7c3ab8e252d7ca6baef32d4
Apr 16 20:30:39.907783 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.907709 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c9d75854d-g48xr"
Apr 16 20:30:39.979226 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:39.979187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7595676dfd-gm8pm" event={"ID":"c0160a02-d88b-428c-a5cf-980839520cbc","Type":"ContainerStarted","Data":"e9a8e83735ff93683ad3b567e515c798096a1f19b7c3ab8e252d7ca6baef32d4"}
Apr 16 20:30:42.989988 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:42.989938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7595676dfd-gm8pm" event={"ID":"c0160a02-d88b-428c-a5cf-980839520cbc","Type":"ContainerStarted","Data":"072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb"}
Apr 16 20:30:43.009865 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:43.009812 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7595676dfd-gm8pm" podStartSLOduration=0.737563061 podStartE2EDuration="4.009795678s" podCreationTimestamp="2026-04-16 20:30:39 +0000 UTC" firstStartedPulling="2026-04-16 20:30:39.580080413 +0000 UTC m=+186.849000806" lastFinishedPulling="2026-04-16 20:30:42.852313032 +0000 UTC m=+190.121233423" observedRunningTime="2026-04-16 20:30:43.008506267 +0000 UTC m=+190.277426677" watchObservedRunningTime="2026-04-16 20:30:43.009795678 +0000 UTC m=+190.278716082"
Apr 16 20:30:46.768569 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.768537 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fd5789f67-7jtz6"]
Apr 16 20:30:46.771762 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.771736 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.781082 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.780523 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 20:30:46.781782 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.781744 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd5789f67-7jtz6"]
Apr 16 20:30:46.914208 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-trusted-ca-bundle\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.914208 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-oauth-config\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.914432 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-oauth-serving-cert\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.914432 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q74w5\" (UniqueName: \"kubernetes.io/projected/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-kube-api-access-q74w5\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.914432 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-service-ca\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.914556 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-serving-cert\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:46.914556 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:46.914487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-config\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015346 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-service-ca\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015561 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-serving-cert\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015561 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-config\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015561 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-trusted-ca-bundle\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015861 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-oauth-config\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015861 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-oauth-serving-cert\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.015861 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.015671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q74w5\" (UniqueName: \"kubernetes.io/projected/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-kube-api-access-q74w5\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.016387 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.016361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-config\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.016510 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.016391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-oauth-serving-cert\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6"
Apr 16 20:30:47.016629
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.016603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-trusted-ca-bundle\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:47.017069 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.017051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-service-ca\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:47.018539 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.018517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-oauth-config\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:47.018638 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.018560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-serving-cert\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:47.024145 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.024123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q74w5\" (UniqueName: \"kubernetes.io/projected/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-kube-api-access-q74w5\") pod \"console-6fd5789f67-7jtz6\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " 
pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:47.083235 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.083194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:47.213271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:47.213191 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd5789f67-7jtz6"] Apr 16 20:30:47.215702 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:30:47.215666 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c9da04_0f06_424c_a3d6_ccaea12d2bb4.slice/crio-1494e060dd7e45a7f3d2c59d2f6359b44992d6f9d0919746f35f2dbc62bb1658 WatchSource:0}: Error finding container 1494e060dd7e45a7f3d2c59d2f6359b44992d6f9d0919746f35f2dbc62bb1658: Status 404 returned error can't find the container with id 1494e060dd7e45a7f3d2c59d2f6359b44992d6f9d0919746f35f2dbc62bb1658 Apr 16 20:30:48.007419 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:48.007381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd5789f67-7jtz6" event={"ID":"73c9da04-0f06-424c-a3d6-ccaea12d2bb4","Type":"ContainerStarted","Data":"f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085"} Apr 16 20:30:48.007419 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:48.007422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd5789f67-7jtz6" event={"ID":"73c9da04-0f06-424c-a3d6-ccaea12d2bb4","Type":"ContainerStarted","Data":"1494e060dd7e45a7f3d2c59d2f6359b44992d6f9d0919746f35f2dbc62bb1658"} Apr 16 20:30:48.024344 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:48.024292 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fd5789f67-7jtz6" podStartSLOduration=2.024272324 podStartE2EDuration="2.024272324s" podCreationTimestamp="2026-04-16 20:30:46 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:48.02251081 +0000 UTC m=+195.291431219" watchObservedRunningTime="2026-04-16 20:30:48.024272324 +0000 UTC m=+195.293192734" Apr 16 20:30:49.428244 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:49.428210 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7595676dfd-gm8pm" Apr 16 20:30:49.428244 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:49.428251 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7595676dfd-gm8pm" Apr 16 20:30:49.433300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:49.433276 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7595676dfd-gm8pm" Apr 16 20:30:50.019710 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:50.019678 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7595676dfd-gm8pm" Apr 16 20:30:57.036615 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.036570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6pptv" event={"ID":"b420aa5e-b5f4-4c5a-963e-ff55196e4ca9","Type":"ContainerStarted","Data":"390e702148c784dc86c320dc26291febba48ebf3e71146e3be0323c99ee2f04b"} Apr 16 20:30:57.037068 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.036738 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-6pptv" Apr 16 20:30:57.056479 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.056408 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-6pptv" podStartSLOduration=1.635350845 podStartE2EDuration="22.05638724s" podCreationTimestamp="2026-04-16 20:30:35 +0000 UTC" 
firstStartedPulling="2026-04-16 20:30:35.783492951 +0000 UTC m=+183.052413339" lastFinishedPulling="2026-04-16 20:30:56.204529346 +0000 UTC m=+203.473449734" observedRunningTime="2026-04-16 20:30:57.054724193 +0000 UTC m=+204.323644603" watchObservedRunningTime="2026-04-16 20:30:57.05638724 +0000 UTC m=+204.325307686" Apr 16 20:30:57.062583 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.062553 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-6pptv" Apr 16 20:30:57.084338 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.084306 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:57.084595 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.084573 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:57.090596 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:57.090567 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:58.044482 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:58.044429 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:30:58.088741 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:30:58.088713 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7595676dfd-gm8pm"] Apr 16 20:31:05.528141 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:05.528115 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-cc5b6bc9d-d9pl5_b0125b55-3e0c-4bba-b620-3460a3974959/router/0.log" Apr 16 20:31:05.545348 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:05.545320 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-skwc8_f1de9e75-c8b2-4fee-898a-82488ff8d677/serve-healthcheck-canary/0.log" Apr 16 20:31:11.078345 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:11.078312 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e4283e6-7889-48d8-acdf-e35108f466bb" containerID="573b1d76500b6f4044c4ec14c7d95fd1ba7f5e08efc19e21f4cf83c95fca93b8" exitCode=0 Apr 16 20:31:11.078830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:11.078405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" event={"ID":"1e4283e6-7889-48d8-acdf-e35108f466bb","Type":"ContainerDied","Data":"573b1d76500b6f4044c4ec14c7d95fd1ba7f5e08efc19e21f4cf83c95fca93b8"} Apr 16 20:31:11.078830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:11.078771 2577 scope.go:117] "RemoveContainer" containerID="573b1d76500b6f4044c4ec14c7d95fd1ba7f5e08efc19e21f4cf83c95fca93b8" Apr 16 20:31:12.082646 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:12.082603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tdhc6" event={"ID":"1e4283e6-7889-48d8-acdf-e35108f466bb","Type":"ContainerStarted","Data":"5fba5c94876af68a57ea5ac7e1dd3ef92f6ce9646d5dbc4c90dd50efd63a27bc"} Apr 16 20:31:15.091840 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:15.091805 2577 generic.go:358] "Generic (PLEG): container finished" podID="1a4fa98c-05e4-48fe-93c7-d01bd593d03a" containerID="35414d1a433246b72be9d1f8f079d3d0893d78b4dd98e2bec641ba486bdfa4c6" exitCode=0 Apr 16 20:31:15.092210 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:15.091877 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" event={"ID":"1a4fa98c-05e4-48fe-93c7-d01bd593d03a","Type":"ContainerDied","Data":"35414d1a433246b72be9d1f8f079d3d0893d78b4dd98e2bec641ba486bdfa4c6"} 
Apr 16 20:31:15.092210 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:15.092165 2577 scope.go:117] "RemoveContainer" containerID="35414d1a433246b72be9d1f8f079d3d0893d78b4dd98e2bec641ba486bdfa4c6" Apr 16 20:31:16.096704 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:16.096675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kqkjb" event={"ID":"1a4fa98c-05e4-48fe-93c7-d01bd593d03a","Type":"ContainerStarted","Data":"2acf7075824839515e0d00ea709906f23034878ad0897e3cbdaf5c89fe82ccb4"} Apr 16 20:31:23.123792 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.123751 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7595676dfd-gm8pm" podUID="c0160a02-d88b-428c-a5cf-980839520cbc" containerName="console" containerID="cri-o://072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb" gracePeriod=15 Apr 16 20:31:23.403753 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.403732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7595676dfd-gm8pm_c0160a02-d88b-428c-a5cf-980839520cbc/console/0.log" Apr 16 20:31:23.403886 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.403817 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7595676dfd-gm8pm" Apr 16 20:31:23.440983 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.440943 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-serving-cert\") pod \"c0160a02-d88b-428c-a5cf-980839520cbc\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " Apr 16 20:31:23.441135 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441016 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-console-config\") pod \"c0160a02-d88b-428c-a5cf-980839520cbc\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " Apr 16 20:31:23.441135 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441043 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-service-ca\") pod \"c0160a02-d88b-428c-a5cf-980839520cbc\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " Apr 16 20:31:23.441135 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441100 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqszv\" (UniqueName: \"kubernetes.io/projected/c0160a02-d88b-428c-a5cf-980839520cbc-kube-api-access-xqszv\") pod \"c0160a02-d88b-428c-a5cf-980839520cbc\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " Apr 16 20:31:23.441276 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441155 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-oauth-config\") pod \"c0160a02-d88b-428c-a5cf-980839520cbc\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " Apr 16 20:31:23.441276 
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441199 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-oauth-serving-cert\") pod \"c0160a02-d88b-428c-a5cf-980839520cbc\" (UID: \"c0160a02-d88b-428c-a5cf-980839520cbc\") " Apr 16 20:31:23.441455 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441428 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-console-config" (OuterVolumeSpecName: "console-config") pod "c0160a02-d88b-428c-a5cf-980839520cbc" (UID: "c0160a02-d88b-428c-a5cf-980839520cbc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:23.441553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441449 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-service-ca" (OuterVolumeSpecName: "service-ca") pod "c0160a02-d88b-428c-a5cf-980839520cbc" (UID: "c0160a02-d88b-428c-a5cf-980839520cbc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:23.441772 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.441747 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c0160a02-d88b-428c-a5cf-980839520cbc" (UID: "c0160a02-d88b-428c-a5cf-980839520cbc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:31:23.443317 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.443291 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c0160a02-d88b-428c-a5cf-980839520cbc" (UID: "c0160a02-d88b-428c-a5cf-980839520cbc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:23.443408 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.443356 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0160a02-d88b-428c-a5cf-980839520cbc-kube-api-access-xqszv" (OuterVolumeSpecName: "kube-api-access-xqszv") pod "c0160a02-d88b-428c-a5cf-980839520cbc" (UID: "c0160a02-d88b-428c-a5cf-980839520cbc"). InnerVolumeSpecName "kube-api-access-xqszv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:31:23.443408 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.443366 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c0160a02-d88b-428c-a5cf-980839520cbc" (UID: "c0160a02-d88b-428c-a5cf-980839520cbc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:31:23.542351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.542316 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-oauth-serving-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:31:23.542351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.542348 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-serving-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:31:23.542351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.542358 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-console-config\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:31:23.542622 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.542369 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0160a02-d88b-428c-a5cf-980839520cbc-service-ca\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:31:23.542622 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.542378 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqszv\" (UniqueName: \"kubernetes.io/projected/c0160a02-d88b-428c-a5cf-980839520cbc-kube-api-access-xqszv\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:31:23.542622 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:23.542387 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0160a02-d88b-428c-a5cf-980839520cbc-console-oauth-config\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:31:24.120766 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:31:24.120738 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7595676dfd-gm8pm_c0160a02-d88b-428c-a5cf-980839520cbc/console/0.log" Apr 16 20:31:24.120970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.120777 2577 generic.go:358] "Generic (PLEG): container finished" podID="c0160a02-d88b-428c-a5cf-980839520cbc" containerID="072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb" exitCode=2 Apr 16 20:31:24.120970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.120816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7595676dfd-gm8pm" event={"ID":"c0160a02-d88b-428c-a5cf-980839520cbc","Type":"ContainerDied","Data":"072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb"} Apr 16 20:31:24.120970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.120864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7595676dfd-gm8pm" event={"ID":"c0160a02-d88b-428c-a5cf-980839520cbc","Type":"ContainerDied","Data":"e9a8e83735ff93683ad3b567e515c798096a1f19b7c3ab8e252d7ca6baef32d4"} Apr 16 20:31:24.120970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.120866 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7595676dfd-gm8pm" Apr 16 20:31:24.120970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.120878 2577 scope.go:117] "RemoveContainer" containerID="072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb" Apr 16 20:31:24.129718 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.129581 2577 scope.go:117] "RemoveContainer" containerID="072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb" Apr 16 20:31:24.129931 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:31:24.129856 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb\": container with ID starting with 072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb not found: ID does not exist" containerID="072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb" Apr 16 20:31:24.129931 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.129884 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb"} err="failed to get container status \"072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb\": rpc error: code = NotFound desc = could not find container \"072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb\": container with ID starting with 072b5f60154dc910da8112b50dd9a7165918ab2a3624d459dec17b10c8dc02cb not found: ID does not exist" Apr 16 20:31:24.145981 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.145951 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7595676dfd-gm8pm"] Apr 16 20:31:24.152084 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:24.152056 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7595676dfd-gm8pm"] Apr 16 20:31:25.283210 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:31:25.283176 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0160a02-d88b-428c-a5cf-980839520cbc" path="/var/lib/kubelet/pods/c0160a02-d88b-428c-a5cf-980839520cbc/volumes" Apr 16 20:31:44.111073 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:44.111019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:31:44.113450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:44.113428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30274609-546d-4c7b-abd0-8907fd0a6cd7-metrics-certs\") pod \"network-metrics-daemon-b8p9v\" (UID: \"30274609-546d-4c7b-abd0-8907fd0a6cd7\") " pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:31:44.383811 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:44.383723 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8bd2\"" Apr 16 20:31:44.391843 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:44.391813 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b8p9v" Apr 16 20:31:44.511846 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:44.511818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b8p9v"] Apr 16 20:31:44.514405 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:31:44.514373 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30274609_546d_4c7b_abd0_8907fd0a6cd7.slice/crio-163babf27a0edb1ea9f7a155797091ffd841b21cda79e6f162b13e33f313f96e WatchSource:0}: Error finding container 163babf27a0edb1ea9f7a155797091ffd841b21cda79e6f162b13e33f313f96e: Status 404 returned error can't find the container with id 163babf27a0edb1ea9f7a155797091ffd841b21cda79e6f162b13e33f313f96e Apr 16 20:31:45.185978 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:45.185946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b8p9v" event={"ID":"30274609-546d-4c7b-abd0-8907fd0a6cd7","Type":"ContainerStarted","Data":"163babf27a0edb1ea9f7a155797091ffd841b21cda79e6f162b13e33f313f96e"} Apr 16 20:31:47.192579 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:47.192536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b8p9v" event={"ID":"30274609-546d-4c7b-abd0-8907fd0a6cd7","Type":"ContainerStarted","Data":"9db7edbba1cd851c8b66e0fa0236dd5dcc2aaefc30d6091b6f5891f709306da3"} Apr 16 20:31:47.192579 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:47.192578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b8p9v" event={"ID":"30274609-546d-4c7b-abd0-8907fd0a6cd7","Type":"ContainerStarted","Data":"071f4fe3363a7ad9d5a6e54d0a92020c66b62016caf668f1727530836e2782bc"} Apr 16 20:31:47.207159 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:47.207112 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-b8p9v" podStartSLOduration=252.43815209 podStartE2EDuration="4m14.207094781s" podCreationTimestamp="2026-04-16 20:27:33 +0000 UTC" firstStartedPulling="2026-04-16 20:31:44.516207001 +0000 UTC m=+251.785127388" lastFinishedPulling="2026-04-16 20:31:46.285149679 +0000 UTC m=+253.554070079" observedRunningTime="2026-04-16 20:31:47.206720014 +0000 UTC m=+254.475640448" watchObservedRunningTime="2026-04-16 20:31:47.207094781 +0000 UTC m=+254.476015193" Apr 16 20:31:51.015109 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.015067 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cbd54666c-prnm9"] Apr 16 20:31:51.015504 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.015386 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0160a02-d88b-428c-a5cf-980839520cbc" containerName="console" Apr 16 20:31:51.015504 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.015398 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0160a02-d88b-428c-a5cf-980839520cbc" containerName="console" Apr 16 20:31:51.015504 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.015454 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0160a02-d88b-428c-a5cf-980839520cbc" containerName="console" Apr 16 20:31:51.018619 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.018602 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.027419 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.027395 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cbd54666c-prnm9"] Apr 16 20:31:51.176390 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg9tc\" (UniqueName: \"kubernetes.io/projected/a47ef05d-c48a-4d72-8b8a-88348425f32f-kube-api-access-gg9tc\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.176598 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-oauth-config\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.176598 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-trusted-ca-bundle\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.176598 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-config\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 
20:31:51.176598 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-service-ca\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.176787 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-oauth-serving-cert\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.176787 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.176646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-serving-cert\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277602 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-oauth-serving-cert\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277602 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-serving-cert\") pod 
\"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277602 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg9tc\" (UniqueName: \"kubernetes.io/projected/a47ef05d-c48a-4d72-8b8a-88348425f32f-kube-api-access-gg9tc\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277879 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277617 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-oauth-config\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277879 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-trusted-ca-bundle\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277879 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-config\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.277879 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.277682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-service-ca\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.278360 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.278332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-service-ca\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.278499 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.278395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-oauth-serving-cert\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.278499 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.278431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-config\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.278654 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.278629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-trusted-ca-bundle\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.280254 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.280234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-serving-cert\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.280494 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.280452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-oauth-config\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.286883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.286864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg9tc\" (UniqueName: \"kubernetes.io/projected/a47ef05d-c48a-4d72-8b8a-88348425f32f-kube-api-access-gg9tc\") pod \"console-cbd54666c-prnm9\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.328811 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.328771 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:31:51.446829 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:51.446804 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cbd54666c-prnm9"] Apr 16 20:31:51.449132 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:31:51.449102 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda47ef05d_c48a_4d72_8b8a_88348425f32f.slice/crio-a3f658bc72583f3f4039a44e9898641df55c943d407f66818dd5b51091185372 WatchSource:0}: Error finding container a3f658bc72583f3f4039a44e9898641df55c943d407f66818dd5b51091185372: Status 404 returned error can't find the container with id a3f658bc72583f3f4039a44e9898641df55c943d407f66818dd5b51091185372 Apr 16 20:31:52.210639 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:52.210603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cbd54666c-prnm9" event={"ID":"a47ef05d-c48a-4d72-8b8a-88348425f32f","Type":"ContainerStarted","Data":"7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16"} Apr 16 20:31:52.211010 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:52.210646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cbd54666c-prnm9" event={"ID":"a47ef05d-c48a-4d72-8b8a-88348425f32f","Type":"ContainerStarted","Data":"a3f658bc72583f3f4039a44e9898641df55c943d407f66818dd5b51091185372"} Apr 16 20:31:52.227773 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:31:52.227718 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cbd54666c-prnm9" podStartSLOduration=2.227701103 podStartE2EDuration="2.227701103s" podCreationTimestamp="2026-04-16 20:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:52.226296325 +0000 UTC m=+259.495216716" 
watchObservedRunningTime="2026-04-16 20:31:52.227701103 +0000 UTC m=+259.496621516" Apr 16 20:32:01.329500 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:01.329449 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:32:01.329500 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:01.329502 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:32:01.334071 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:01.334046 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:32:02.242024 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:02.241996 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:32:02.288243 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:02.288213 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fd5789f67-7jtz6"] Apr 16 20:32:27.306963 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.306919 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fd5789f67-7jtz6" podUID="73c9da04-0f06-424c-a3d6-ccaea12d2bb4" containerName="console" containerID="cri-o://f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085" gracePeriod=15 Apr 16 20:32:27.551901 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.551878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd5789f67-7jtz6_73c9da04-0f06-424c-a3d6-ccaea12d2bb4/console/0.log" Apr 16 20:32:27.552032 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.551942 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:32:27.690822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.690789 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-oauth-config\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.690989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.690830 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-config\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.690989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.690891 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-service-ca\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.690989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.690909 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-oauth-serving-cert\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.690989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.690929 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-serving-cert\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.690989 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:32:27.690949 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q74w5\" (UniqueName: \"kubernetes.io/projected/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-kube-api-access-q74w5\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.690989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.690974 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-trusted-ca-bundle\") pod \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\" (UID: \"73c9da04-0f06-424c-a3d6-ccaea12d2bb4\") " Apr 16 20:32:27.691368 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.691328 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-config" (OuterVolumeSpecName: "console-config") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:32:27.691515 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.691457 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:32:27.691515 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.691340 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-service-ca" (OuterVolumeSpecName: "service-ca") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:32:27.691515 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.691367 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:32:27.693098 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.693069 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:32:27.693098 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.693083 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-kube-api-access-q74w5" (OuterVolumeSpecName: "kube-api-access-q74w5") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "kube-api-access-q74w5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:32:27.693213 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.693109 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "73c9da04-0f06-424c-a3d6-ccaea12d2bb4" (UID: "73c9da04-0f06-424c-a3d6-ccaea12d2bb4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:32:27.791803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791770 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-oauth-config\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:27.791803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791798 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-config\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:27.791803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791809 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-service-ca\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:27.792012 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791818 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-oauth-serving-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:27.792012 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791827 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-console-serving-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:27.792012 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791835 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q74w5\" (UniqueName: \"kubernetes.io/projected/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-kube-api-access-q74w5\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:27.792012 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:27.791846 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c9da04-0f06-424c-a3d6-ccaea12d2bb4-trusted-ca-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:32:28.309969 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.309943 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fd5789f67-7jtz6_73c9da04-0f06-424c-a3d6-ccaea12d2bb4/console/0.log" Apr 16 20:32:28.310438 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.309982 2577 generic.go:358] "Generic (PLEG): container finished" podID="73c9da04-0f06-424c-a3d6-ccaea12d2bb4" containerID="f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085" exitCode=2 Apr 16 20:32:28.310438 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.310063 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fd5789f67-7jtz6" Apr 16 20:32:28.310438 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.310072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd5789f67-7jtz6" event={"ID":"73c9da04-0f06-424c-a3d6-ccaea12d2bb4","Type":"ContainerDied","Data":"f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085"} Apr 16 20:32:28.310438 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.310102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd5789f67-7jtz6" event={"ID":"73c9da04-0f06-424c-a3d6-ccaea12d2bb4","Type":"ContainerDied","Data":"1494e060dd7e45a7f3d2c59d2f6359b44992d6f9d0919746f35f2dbc62bb1658"} Apr 16 20:32:28.310438 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.310118 2577 scope.go:117] "RemoveContainer" containerID="f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085" Apr 16 20:32:28.318064 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.318040 2577 scope.go:117] "RemoveContainer" containerID="f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085" Apr 16 20:32:28.318370 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:32:28.318331 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085\": container with ID starting with f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085 not found: ID does not exist" containerID="f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085" Apr 16 20:32:28.318370 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.318360 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085"} err="failed to get container status \"f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085\": rpc error: code = 
NotFound desc = could not find container \"f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085\": container with ID starting with f9ee37314e369badcfdaf7e743777bdae44817595460b082375f125da381d085 not found: ID does not exist" Apr 16 20:32:28.329941 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.329919 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fd5789f67-7jtz6"] Apr 16 20:32:28.333534 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:28.333513 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fd5789f67-7jtz6"] Apr 16 20:32:29.284217 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:29.284174 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c9da04-0f06-424c-a3d6-ccaea12d2bb4" path="/var/lib/kubelet/pods/73c9da04-0f06-424c-a3d6-ccaea12d2bb4/volumes" Apr 16 20:32:33.190419 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:33.190390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:32:33.191101 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:33.190390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:32:33.196709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:32:33.196687 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:33:02.903087 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.903051 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m"] Apr 16 20:33:02.905450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.903364 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73c9da04-0f06-424c-a3d6-ccaea12d2bb4" containerName="console" Apr 16 
20:33:02.905450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.903380 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c9da04-0f06-424c-a3d6-ccaea12d2bb4" containerName="console" Apr 16 20:33:02.905450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.903432 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="73c9da04-0f06-424c-a3d6-ccaea12d2bb4" containerName="console" Apr 16 20:33:02.906277 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.906262 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:02.908793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.908772 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:33:02.909740 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.909718 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5kphd\"" Apr 16 20:33:02.909840 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.909718 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:33:02.914222 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.914202 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m"] Apr 16 20:33:02.965173 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.965137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:02.965351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.965192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjfs\" (UniqueName: \"kubernetes.io/projected/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-kube-api-access-5jjfs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:02.965351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:02.965256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:03.066558 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.066518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:03.066749 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.066601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjfs\" (UniqueName: \"kubernetes.io/projected/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-kube-api-access-5jjfs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:03.066749 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.066634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:03.066873 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.066860 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:03.066961 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.066943 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 20:33:03.075870 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.075846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjfs\" (UniqueName: \"kubernetes.io/projected/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-kube-api-access-5jjfs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" Apr 16 
20:33:03.215858 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.215827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m"
Apr 16 20:33:03.341030 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.340996 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m"]
Apr 16 20:33:03.344130 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:33:03.344104 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb2391c_fe2c_49bd_9ac2_ba6fd39c7869.slice/crio-705f901f3040edbcc3a8e55bf59bd2e6f97aa25b4a0393484c03448c2d588610 WatchSource:0}: Error finding container 705f901f3040edbcc3a8e55bf59bd2e6f97aa25b4a0393484c03448c2d588610: Status 404 returned error can't find the container with id 705f901f3040edbcc3a8e55bf59bd2e6f97aa25b4a0393484c03448c2d588610
Apr 16 20:33:03.346544 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.346526 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:33:03.405800 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:03.405770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" event={"ID":"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869","Type":"ContainerStarted","Data":"705f901f3040edbcc3a8e55bf59bd2e6f97aa25b4a0393484c03448c2d588610"}
Apr 16 20:33:11.428947 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:11.428908 2577 generic.go:358] "Generic (PLEG): container finished" podID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerID="a4e177899ca2e5a2abcfe13aebd47d14f7c741b4aae04c313b36043503570e72" exitCode=0
Apr 16 20:33:11.429346 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:11.428989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" event={"ID":"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869","Type":"ContainerDied","Data":"a4e177899ca2e5a2abcfe13aebd47d14f7c741b4aae04c313b36043503570e72"}
Apr 16 20:33:14.440479 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:14.440429 2577 generic.go:358] "Generic (PLEG): container finished" podID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerID="a01bb903413cbf31c2533ea77af0d1d16f908f71529e505f3c7fc2c6c16243c9" exitCode=0
Apr 16 20:33:14.440859 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:14.440515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" event={"ID":"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869","Type":"ContainerDied","Data":"a01bb903413cbf31c2533ea77af0d1d16f908f71529e505f3c7fc2c6c16243c9"}
Apr 16 20:33:23.471754 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:23.471718 2577 generic.go:358] "Generic (PLEG): container finished" podID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerID="2fd6821e27bec2f02f811c0e396ec1f7ff3f9c6b81550682e9ace3d51ef9b25f" exitCode=0
Apr 16 20:33:23.472158 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:23.471770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" event={"ID":"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869","Type":"ContainerDied","Data":"2fd6821e27bec2f02f811c0e396ec1f7ff3f9c6b81550682e9ace3d51ef9b25f"}
Apr 16 20:33:24.592042 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.592017 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m"
Apr 16 20:33:24.648618 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.648582 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-bundle\") pod \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") "
Apr 16 20:33:24.648803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.648649 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjfs\" (UniqueName: \"kubernetes.io/projected/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-kube-api-access-5jjfs\") pod \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") "
Apr 16 20:33:24.648803 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.648683 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-util\") pod \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\" (UID: \"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869\") "
Apr 16 20:33:24.649258 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.649228 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-bundle" (OuterVolumeSpecName: "bundle") pod "edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" (UID: "edb2391c-fe2c-49bd-9ac2-ba6fd39c7869"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:33:24.650931 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.650907 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-kube-api-access-5jjfs" (OuterVolumeSpecName: "kube-api-access-5jjfs") pod "edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" (UID: "edb2391c-fe2c-49bd-9ac2-ba6fd39c7869"). InnerVolumeSpecName "kube-api-access-5jjfs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:33:24.652751 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.652728 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-util" (OuterVolumeSpecName: "util") pod "edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" (UID: "edb2391c-fe2c-49bd-9ac2-ba6fd39c7869"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:33:24.749437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.749351 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jjfs\" (UniqueName: \"kubernetes.io/projected/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-kube-api-access-5jjfs\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:33:24.749437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.749382 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:33:24.749437 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:24.749392 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edb2391c-fe2c-49bd-9ac2-ba6fd39c7869-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:33:25.478810 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:25.478776 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m" event={"ID":"edb2391c-fe2c-49bd-9ac2-ba6fd39c7869","Type":"ContainerDied","Data":"705f901f3040edbcc3a8e55bf59bd2e6f97aa25b4a0393484c03448c2d588610"}
Apr 16 20:33:25.478810 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:25.478807 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rz79m"
Apr 16 20:33:25.478999 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:25.478809 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="705f901f3040edbcc3a8e55bf59bd2e6f97aa25b4a0393484c03448c2d588610"
Apr 16 20:33:30.354163 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354129 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"]
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354443 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="util"
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354454 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="util"
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354487 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="extract"
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354496 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="extract"
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354508 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="pull"
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354514 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="pull"
Apr 16 20:33:30.354634 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.354566 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="edb2391c-fe2c-49bd-9ac2-ba6fd39c7869" containerName="extract"
Apr 16 20:33:30.359066 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.359046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.361500 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.361473 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:33:30.361619 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.361484 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-gjnrp\""
Apr 16 20:33:30.361619 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.361585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 20:33:30.368819 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.368798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"]
Apr 16 20:33:30.494090 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.494060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2h2k\" (UniqueName: \"kubernetes.io/projected/7c55c34e-e90b-4553-9d12-c7e2c143fc1b-kube-api-access-z2h2k\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-42sdj\" (UID: \"7c55c34e-e90b-4553-9d12-c7e2c143fc1b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.494090 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.494093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c55c34e-e90b-4553-9d12-c7e2c143fc1b-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-42sdj\" (UID: \"7c55c34e-e90b-4553-9d12-c7e2c143fc1b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.594777 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.594744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2h2k\" (UniqueName: \"kubernetes.io/projected/7c55c34e-e90b-4553-9d12-c7e2c143fc1b-kube-api-access-z2h2k\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-42sdj\" (UID: \"7c55c34e-e90b-4553-9d12-c7e2c143fc1b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.594777 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.594783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c55c34e-e90b-4553-9d12-c7e2c143fc1b-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-42sdj\" (UID: \"7c55c34e-e90b-4553-9d12-c7e2c143fc1b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.595118 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.595104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c55c34e-e90b-4553-9d12-c7e2c143fc1b-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-42sdj\" (UID: \"7c55c34e-e90b-4553-9d12-c7e2c143fc1b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.603508 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.603482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2h2k\" (UniqueName: \"kubernetes.io/projected/7c55c34e-e90b-4553-9d12-c7e2c143fc1b-kube-api-access-z2h2k\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-42sdj\" (UID: \"7c55c34e-e90b-4553-9d12-c7e2c143fc1b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.668257 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.668238 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"
Apr 16 20:33:30.787436 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:30.787400 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj"]
Apr 16 20:33:30.791530 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:33:30.791504 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c55c34e_e90b_4553_9d12_c7e2c143fc1b.slice/crio-95987f356ed130ddce0c36d83ef6531fe95474697a5ab14a22004b82c0083726 WatchSource:0}: Error finding container 95987f356ed130ddce0c36d83ef6531fe95474697a5ab14a22004b82c0083726: Status 404 returned error can't find the container with id 95987f356ed130ddce0c36d83ef6531fe95474697a5ab14a22004b82c0083726
Apr 16 20:33:31.497393 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:31.497357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj" event={"ID":"7c55c34e-e90b-4553-9d12-c7e2c143fc1b","Type":"ContainerStarted","Data":"95987f356ed130ddce0c36d83ef6531fe95474697a5ab14a22004b82c0083726"}
Apr 16 20:33:33.511417 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:33.511374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj" event={"ID":"7c55c34e-e90b-4553-9d12-c7e2c143fc1b","Type":"ContainerStarted","Data":"9dcf26f23afd9f3ba5b6a81161f5e9c22b931b14cf42f6ad4cae9e86f3da2996"}
Apr 16 20:33:33.533574 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:33.533522 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-42sdj" podStartSLOduration=1.458041114 podStartE2EDuration="3.53350534s" podCreationTimestamp="2026-04-16 20:33:30 +0000 UTC" firstStartedPulling="2026-04-16 20:33:30.793829354 +0000 UTC m=+358.062749743" lastFinishedPulling="2026-04-16 20:33:32.869293578 +0000 UTC m=+360.138213969" observedRunningTime="2026-04-16 20:33:33.531536122 +0000 UTC m=+360.800456531" watchObservedRunningTime="2026-04-16 20:33:33.53350534 +0000 UTC m=+360.802425804"
Apr 16 20:33:34.981591 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:34.981556 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"]
Apr 16 20:33:34.985063 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:34.985048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:34.987890 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:34.987870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:33:34.988016 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:34.987985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5kphd\""
Apr 16 20:33:34.988963 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:34.988948 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:33:34.993618 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:34.993594 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"]
Apr 16 20:33:35.132550 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.132515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.132731 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.132559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.132731 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.132583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68n5p\" (UniqueName: \"kubernetes.io/projected/b7f458df-e378-4878-8a2b-281726f325c0-kube-api-access-68n5p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.234001 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.233908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.234001 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.233961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.234219 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.234076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68n5p\" (UniqueName: \"kubernetes.io/projected/b7f458df-e378-4878-8a2b-281726f325c0-kube-api-access-68n5p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.234285 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.234271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.234352 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.234315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.243018 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.242996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68n5p\" (UniqueName: \"kubernetes.io/projected/b7f458df-e378-4878-8a2b-281726f325c0-kube-api-access-68n5p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.294344 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.294309 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:35.436758 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.436713 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"]
Apr 16 20:33:35.441374 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:33:35.441332 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7f458df_e378_4878_8a2b_281726f325c0.slice/crio-f722f331f7ff57cecf28b6cfe6f22ca9799b6c2b2925e59a11f87fb2e4a57ee5 WatchSource:0}: Error finding container f722f331f7ff57cecf28b6cfe6f22ca9799b6c2b2925e59a11f87fb2e4a57ee5: Status 404 returned error can't find the container with id f722f331f7ff57cecf28b6cfe6f22ca9799b6c2b2925e59a11f87fb2e4a57ee5
Apr 16 20:33:35.519390 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.519358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs" event={"ID":"b7f458df-e378-4878-8a2b-281726f325c0","Type":"ContainerStarted","Data":"f5377575b63f5c00968c34a7ca99585a4f205af99dcc97387f3e0dbe9b08e158"}
Apr 16 20:33:35.519390 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.519395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs" event={"ID":"b7f458df-e378-4878-8a2b-281726f325c0","Type":"ContainerStarted","Data":"f722f331f7ff57cecf28b6cfe6f22ca9799b6c2b2925e59a11f87fb2e4a57ee5"}
Apr 16 20:33:35.620769 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.620736 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-rn6ws"]
Apr 16 20:33:35.623804 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.623783 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.626439 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.626418 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 20:33:35.626556 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.626540 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 20:33:35.626624 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.626598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-k28t2\""
Apr 16 20:33:35.634611 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.634587 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-rn6ws"]
Apr 16 20:33:35.739003 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.738965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e694b6-4333-4cef-a8c5-df39e8b950d4-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-rn6ws\" (UID: \"62e694b6-4333-4cef-a8c5-df39e8b950d4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.739003 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.739003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqk4v\" (UniqueName: \"kubernetes.io/projected/62e694b6-4333-4cef-a8c5-df39e8b950d4-kube-api-access-lqk4v\") pod \"cert-manager-webhook-597b96b99b-rn6ws\" (UID: \"62e694b6-4333-4cef-a8c5-df39e8b950d4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.839989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.839907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e694b6-4333-4cef-a8c5-df39e8b950d4-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-rn6ws\" (UID: \"62e694b6-4333-4cef-a8c5-df39e8b950d4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.839989 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.839939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqk4v\" (UniqueName: \"kubernetes.io/projected/62e694b6-4333-4cef-a8c5-df39e8b950d4-kube-api-access-lqk4v\") pod \"cert-manager-webhook-597b96b99b-rn6ws\" (UID: \"62e694b6-4333-4cef-a8c5-df39e8b950d4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.848495 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.848449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqk4v\" (UniqueName: \"kubernetes.io/projected/62e694b6-4333-4cef-a8c5-df39e8b950d4-kube-api-access-lqk4v\") pod \"cert-manager-webhook-597b96b99b-rn6ws\" (UID: \"62e694b6-4333-4cef-a8c5-df39e8b950d4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.848606 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.848565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62e694b6-4333-4cef-a8c5-df39e8b950d4-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-rn6ws\" (UID: \"62e694b6-4333-4cef-a8c5-df39e8b950d4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:35.944247 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:35.944208 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:36.065808 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:36.065782 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-rn6ws"]
Apr 16 20:33:36.068436 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:33:36.068410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62e694b6_4333_4cef_a8c5_df39e8b950d4.slice/crio-ce91d3b66289e26b5387d465e406d2fcff2fbb6b34df76a5cb0428ebe30b5f23 WatchSource:0}: Error finding container ce91d3b66289e26b5387d465e406d2fcff2fbb6b34df76a5cb0428ebe30b5f23: Status 404 returned error can't find the container with id ce91d3b66289e26b5387d465e406d2fcff2fbb6b34df76a5cb0428ebe30b5f23
Apr 16 20:33:36.524160 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:36.524074 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7f458df-e378-4878-8a2b-281726f325c0" containerID="f5377575b63f5c00968c34a7ca99585a4f205af99dcc97387f3e0dbe9b08e158" exitCode=0
Apr 16 20:33:36.524160 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:36.524147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs" event={"ID":"b7f458df-e378-4878-8a2b-281726f325c0","Type":"ContainerDied","Data":"f5377575b63f5c00968c34a7ca99585a4f205af99dcc97387f3e0dbe9b08e158"}
Apr 16 20:33:36.525292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:36.525265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws" event={"ID":"62e694b6-4333-4cef-a8c5-df39e8b950d4","Type":"ContainerStarted","Data":"ce91d3b66289e26b5387d465e406d2fcff2fbb6b34df76a5cb0428ebe30b5f23"}
Apr 16 20:33:40.542250 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:40.542216 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7f458df-e378-4878-8a2b-281726f325c0" containerID="e3f72c26a4511c196762c2748733567b4a5fe6b90607b52f64afe0cb0e517ec4" exitCode=0
Apr 16 20:33:40.542776 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:40.542307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs" event={"ID":"b7f458df-e378-4878-8a2b-281726f325c0","Type":"ContainerDied","Data":"e3f72c26a4511c196762c2748733567b4a5fe6b90607b52f64afe0cb0e517ec4"}
Apr 16 20:33:40.543709 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:40.543684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws" event={"ID":"62e694b6-4333-4cef-a8c5-df39e8b950d4","Type":"ContainerStarted","Data":"62de9c550cafe502813b489d5e46369ce615bad901f203507f909b26b62ae49a"}
Apr 16 20:33:40.543827 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:40.543814 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:40.580044 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:40.580003 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws" podStartSLOduration=1.941617828 podStartE2EDuration="5.579988863s" podCreationTimestamp="2026-04-16 20:33:35 +0000 UTC" firstStartedPulling="2026-04-16 20:33:36.070365135 +0000 UTC m=+363.339285524" lastFinishedPulling="2026-04-16 20:33:39.708736111 +0000 UTC m=+366.977656559" observedRunningTime="2026-04-16 20:33:40.578806375 +0000 UTC m=+367.847726786" watchObservedRunningTime="2026-04-16 20:33:40.579988863 +0000 UTC m=+367.848909272"
Apr 16 20:33:41.548856 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:41.548825 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7f458df-e378-4878-8a2b-281726f325c0" containerID="028b3f125702a4f62532b77bddc494e8176d8120f0d3670e935fbcbeb1bb6437" exitCode=0
Apr 16 20:33:41.549300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:41.548911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs" event={"ID":"b7f458df-e378-4878-8a2b-281726f325c0","Type":"ContainerDied","Data":"028b3f125702a4f62532b77bddc494e8176d8120f0d3670e935fbcbeb1bb6437"}
Apr 16 20:33:42.671748 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.671726 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:42.801809 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.801775 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-util\") pod \"b7f458df-e378-4878-8a2b-281726f325c0\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") "
Apr 16 20:33:42.802002 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.801840 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-bundle\") pod \"b7f458df-e378-4878-8a2b-281726f325c0\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") "
Apr 16 20:33:42.802002 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.801864 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68n5p\" (UniqueName: \"kubernetes.io/projected/b7f458df-e378-4878-8a2b-281726f325c0-kube-api-access-68n5p\") pod \"b7f458df-e378-4878-8a2b-281726f325c0\" (UID: \"b7f458df-e378-4878-8a2b-281726f325c0\") "
Apr 16 20:33:42.802234 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.802208 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-bundle" (OuterVolumeSpecName: "bundle") pod "b7f458df-e378-4878-8a2b-281726f325c0" (UID: "b7f458df-e378-4878-8a2b-281726f325c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:33:42.803945 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.803891 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f458df-e378-4878-8a2b-281726f325c0-kube-api-access-68n5p" (OuterVolumeSpecName: "kube-api-access-68n5p") pod "b7f458df-e378-4878-8a2b-281726f325c0" (UID: "b7f458df-e378-4878-8a2b-281726f325c0"). InnerVolumeSpecName "kube-api-access-68n5p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:33:42.805703 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.805674 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-util" (OuterVolumeSpecName: "util") pod "b7f458df-e378-4878-8a2b-281726f325c0" (UID: "b7f458df-e378-4878-8a2b-281726f325c0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:33:42.902851 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.902812 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:33:42.902851 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.902848 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7f458df-e378-4878-8a2b-281726f325c0-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:33:42.902851 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:42.902860 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68n5p\" (UniqueName: \"kubernetes.io/projected/b7f458df-e378-4878-8a2b-281726f325c0-kube-api-access-68n5p\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:33:43.557742 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:43.557661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs" event={"ID":"b7f458df-e378-4878-8a2b-281726f325c0","Type":"ContainerDied","Data":"f722f331f7ff57cecf28b6cfe6f22ca9799b6c2b2925e59a11f87fb2e4a57ee5"}
Apr 16 20:33:43.557742 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:43.557705 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f722f331f7ff57cecf28b6cfe6f22ca9799b6c2b2925e59a11f87fb2e4a57ee5"
Apr 16 20:33:43.557742 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:43.557738 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpb5vs"
Apr 16 20:33:46.551602 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:46.551572 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-rn6ws"
Apr 16 20:33:56.979366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979328 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7"]
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979672 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="pull"
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979684 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="pull"
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979692 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="util"
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979698 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="util"
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979704 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="extract"
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979709 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="extract"
Apr 16 20:33:56.979793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.979762 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7f458df-e378-4878-8a2b-281726f325c0" containerName="extract"
Apr 16 20:33:56.983688 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.983670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7"
Apr 16 20:33:56.986189 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.986157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5kphd\""
Apr 16 20:33:56.986307 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.986216 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:33:56.986307 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.986232 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:33:56.991312 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:56.990709 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7"]
Apr 16 20:33:57.119593 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.119555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7"
Apr 16 20:33:57.119593 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.119601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbvw\" (UniqueName: \"kubernetes.io/projected/4c20d99c-199c-4048-b0b6-6e98c0755cfd-kube-api-access-4lbvw\") pod
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.119811 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.119633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.220723 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.220688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.220723 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.220727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbvw\" (UniqueName: \"kubernetes.io/projected/4c20d99c-199c-4048-b0b6-6e98c0755cfd-kube-api-access-4lbvw\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.220951 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.220852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-bundle\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.221075 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.221058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.221136 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.221120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.228279 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.228251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbvw\" (UniqueName: \"kubernetes.io/projected/4c20d99c-199c-4048-b0b6-6e98c0755cfd-kube-api-access-4lbvw\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.293911 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.293847 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:33:57.409231 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.409179 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7"] Apr 16 20:33:57.411915 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:33:57.411885 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c20d99c_199c_4048_b0b6_6e98c0755cfd.slice/crio-58aca8442baf6c805eaf2ef1e5163947dbe48a60cd2a2da20938beca34115936 WatchSource:0}: Error finding container 58aca8442baf6c805eaf2ef1e5163947dbe48a60cd2a2da20938beca34115936: Status 404 returned error can't find the container with id 58aca8442baf6c805eaf2ef1e5163947dbe48a60cd2a2da20938beca34115936 Apr 16 20:33:57.600308 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.600221 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerID="0defa0b6be33cbb3243e8121b88ebc3a6fc57974dcab090f0d678d822515c754" exitCode=0 Apr 16 20:33:57.600501 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.600297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" event={"ID":"4c20d99c-199c-4048-b0b6-6e98c0755cfd","Type":"ContainerDied","Data":"0defa0b6be33cbb3243e8121b88ebc3a6fc57974dcab090f0d678d822515c754"} Apr 16 20:33:57.600501 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:57.600344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" event={"ID":"4c20d99c-199c-4048-b0b6-6e98c0755cfd","Type":"ContainerStarted","Data":"58aca8442baf6c805eaf2ef1e5163947dbe48a60cd2a2da20938beca34115936"} Apr 16 20:33:58.605369 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:33:58.605276 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerID="10e05d8d4f54780f4babbd7755e5cadd84864d8b6c5b7f000101d3b2449d0a31" exitCode=0 Apr 16 20:33:58.605812 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:58.605366 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" event={"ID":"4c20d99c-199c-4048-b0b6-6e98c0755cfd","Type":"ContainerDied","Data":"10e05d8d4f54780f4babbd7755e5cadd84864d8b6c5b7f000101d3b2449d0a31"} Apr 16 20:33:59.610132 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:59.610100 2577 generic.go:358] "Generic (PLEG): container finished" podID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerID="7bf4e3e777f2a4309159c8b773606fca963e5d99296fe79e7266b758c12e12e9" exitCode=0 Apr 16 20:33:59.610519 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:33:59.610167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" event={"ID":"4c20d99c-199c-4048-b0b6-6e98c0755cfd","Type":"ContainerDied","Data":"7bf4e3e777f2a4309159c8b773606fca963e5d99296fe79e7266b758c12e12e9"} Apr 16 20:34:00.733111 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.733090 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:34:00.748352 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.748311 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lbvw\" (UniqueName: \"kubernetes.io/projected/4c20d99c-199c-4048-b0b6-6e98c0755cfd-kube-api-access-4lbvw\") pod \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " Apr 16 20:34:00.748352 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.748346 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-bundle\") pod \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " Apr 16 20:34:00.748528 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.748394 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-util\") pod \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\" (UID: \"4c20d99c-199c-4048-b0b6-6e98c0755cfd\") " Apr 16 20:34:00.749621 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.749592 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-bundle" (OuterVolumeSpecName: "bundle") pod "4c20d99c-199c-4048-b0b6-6e98c0755cfd" (UID: "4c20d99c-199c-4048-b0b6-6e98c0755cfd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:00.751344 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.751315 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c20d99c-199c-4048-b0b6-6e98c0755cfd-kube-api-access-4lbvw" (OuterVolumeSpecName: "kube-api-access-4lbvw") pod "4c20d99c-199c-4048-b0b6-6e98c0755cfd" (UID: "4c20d99c-199c-4048-b0b6-6e98c0755cfd"). InnerVolumeSpecName "kube-api-access-4lbvw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:34:00.757397 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.757367 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-util" (OuterVolumeSpecName: "util") pod "4c20d99c-199c-4048-b0b6-6e98c0755cfd" (UID: "4c20d99c-199c-4048-b0b6-6e98c0755cfd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:00.849830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.849785 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lbvw\" (UniqueName: \"kubernetes.io/projected/4c20d99c-199c-4048-b0b6-6e98c0755cfd-kube-api-access-4lbvw\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:34:00.849830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.849824 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:34:00.849830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:00.849834 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c20d99c-199c-4048-b0b6-6e98c0755cfd-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:34:01.618268 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:01.618192 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" Apr 16 20:34:01.618404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:01.618187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5rxsn7" event={"ID":"4c20d99c-199c-4048-b0b6-6e98c0755cfd","Type":"ContainerDied","Data":"58aca8442baf6c805eaf2ef1e5163947dbe48a60cd2a2da20938beca34115936"} Apr 16 20:34:01.618404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:01.618292 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58aca8442baf6c805eaf2ef1e5163947dbe48a60cd2a2da20938beca34115936" Apr 16 20:34:11.456440 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456395 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2"] Apr 16 20:34:11.456927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456824 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="extract" Apr 16 20:34:11.456927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456842 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="extract" Apr 16 20:34:11.456927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456855 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="util" Apr 16 20:34:11.456927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456862 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="util" Apr 16 20:34:11.456927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456895 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="pull" Apr 16 20:34:11.456927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456903 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="pull" Apr 16 20:34:11.457215 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.456976 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c20d99c-199c-4048-b0b6-6e98c0755cfd" containerName="extract" Apr 16 20:34:11.459910 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.459889 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.463662 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.463642 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:34:11.463863 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.463846 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:34:11.464794 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.464771 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5kphd\"" Apr 16 20:34:11.476263 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.476243 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2"] Apr 16 20:34:11.538120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.538088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: 
\"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.538279 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.538140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.538279 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.538198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mks6d\" (UniqueName: \"kubernetes.io/projected/7ebe6d28-f518-4718-b16d-77682681699a-kube-api-access-mks6d\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.639352 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.639316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.639535 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.639361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mks6d\" (UniqueName: \"kubernetes.io/projected/7ebe6d28-f518-4718-b16d-77682681699a-kube-api-access-mks6d\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: 
\"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.639535 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.639401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.639808 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.639791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.639841 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.639827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.685289 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.685248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mks6d\" (UniqueName: \"kubernetes.io/projected/7ebe6d28-f518-4718-b16d-77682681699a-kube-api-access-mks6d\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.698956 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.698930 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2"] Apr 16 20:34:11.701135 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.701117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.709035 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.709013 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 20:34:11.709137 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.709067 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 20:34:11.714081 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.714061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 20:34:11.714180 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.714124 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 20:34:11.714390 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.714368 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-k9btq\"" Apr 16 20:34:11.729133 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.729104 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2"] Apr 16 20:34:11.740729 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.740708 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xzq8\" (UniqueName: \"kubernetes.io/projected/0d2e4714-1583-4a94-967a-9e28c0789279-kube-api-access-9xzq8\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.740838 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.740782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2e4714-1583-4a94-967a-9e28c0789279-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.740838 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.740804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2e4714-1583-4a94-967a-9e28c0789279-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.769255 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.769230 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" Apr 16 20:34:11.841476 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.841377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2e4714-1583-4a94-967a-9e28c0789279-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.841476 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.841414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2e4714-1583-4a94-967a-9e28c0789279-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.841476 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.841451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xzq8\" (UniqueName: \"kubernetes.io/projected/0d2e4714-1583-4a94-967a-9e28c0789279-kube-api-access-9xzq8\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.843927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.843903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2e4714-1583-4a94-967a-9e28c0789279-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 
20:34:11.844030 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.843929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2e4714-1583-4a94-967a-9e28c0789279-webhook-cert\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.852140 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.852087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xzq8\" (UniqueName: \"kubernetes.io/projected/0d2e4714-1583-4a94-967a-9e28c0789279-kube-api-access-9xzq8\") pod \"opendatahub-operator-controller-manager-6cc777b675-b9vl2\" (UID: \"0d2e4714-1583-4a94-967a-9e28c0789279\") " pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" Apr 16 20:34:11.896038 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:11.896016 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2"] Apr 16 20:34:11.898102 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:34:11.898071 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ebe6d28_f518_4718_b16d_77682681699a.slice/crio-60ab7881af2b34ed056a56a8ec8b65b3984b9e034d3044f6c9089199e3bb5f0e WatchSource:0}: Error finding container 60ab7881af2b34ed056a56a8ec8b65b3984b9e034d3044f6c9089199e3bb5f0e: Status 404 returned error can't find the container with id 60ab7881af2b34ed056a56a8ec8b65b3984b9e034d3044f6c9089199e3bb5f0e Apr 16 20:34:12.011242 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:12.011219 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2"
Apr 16 20:34:12.154618 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:12.154593 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2"]
Apr 16 20:34:12.157036 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:34:12.157007 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d2e4714_1583_4a94_967a_9e28c0789279.slice/crio-2e204dce4dbfb372a3157d938a2fb30ff94052a2ad66c5cf3d034563414b78f6 WatchSource:0}: Error finding container 2e204dce4dbfb372a3157d938a2fb30ff94052a2ad66c5cf3d034563414b78f6: Status 404 returned error can't find the container with id 2e204dce4dbfb372a3157d938a2fb30ff94052a2ad66c5cf3d034563414b78f6
Apr 16 20:34:12.655417 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:12.655378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" event={"ID":"0d2e4714-1583-4a94-967a-9e28c0789279","Type":"ContainerStarted","Data":"2e204dce4dbfb372a3157d938a2fb30ff94052a2ad66c5cf3d034563414b78f6"}
Apr 16 20:34:12.656874 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:12.656844 2577 generic.go:358] "Generic (PLEG): container finished" podID="7ebe6d28-f518-4718-b16d-77682681699a" containerID="3dae27bc2e4c5a52ed7c20d176caaf5f75c1cfa89827b177e136024a2970b9ac" exitCode=0
Apr 16 20:34:12.657014 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:12.656946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" event={"ID":"7ebe6d28-f518-4718-b16d-77682681699a","Type":"ContainerDied","Data":"3dae27bc2e4c5a52ed7c20d176caaf5f75c1cfa89827b177e136024a2970b9ac"}
Apr 16 20:34:12.657014 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:12.656983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" event={"ID":"7ebe6d28-f518-4718-b16d-77682681699a","Type":"ContainerStarted","Data":"60ab7881af2b34ed056a56a8ec8b65b3984b9e034d3044f6c9089199e3bb5f0e"}
Apr 16 20:34:15.670624 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:15.670589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" event={"ID":"0d2e4714-1583-4a94-967a-9e28c0789279","Type":"ContainerStarted","Data":"995a2a56868f74116f42ca5e145e19d60cc909c11bbb0feda8fdeef06330b247"}
Apr 16 20:34:15.671041 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:15.670695 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2"
Apr 16 20:34:15.672184 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:15.672161 2577 generic.go:358] "Generic (PLEG): container finished" podID="7ebe6d28-f518-4718-b16d-77682681699a" containerID="598889f5bd27ab5aa1b410a86247d3f5e00962b7be11daa931a02f699bc49f5e" exitCode=0
Apr 16 20:34:15.672248 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:15.672219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" event={"ID":"7ebe6d28-f518-4718-b16d-77682681699a","Type":"ContainerDied","Data":"598889f5bd27ab5aa1b410a86247d3f5e00962b7be11daa931a02f699bc49f5e"}
Apr 16 20:34:15.693560 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:15.693517 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2" podStartSLOduration=2.244020628 podStartE2EDuration="4.693504402s" podCreationTimestamp="2026-04-16 20:34:11 +0000 UTC" firstStartedPulling="2026-04-16 20:34:12.158819675 +0000 UTC m=+399.427740068" lastFinishedPulling="2026-04-16 20:34:14.608303449 +0000 UTC m=+401.877223842" observedRunningTime="2026-04-16 20:34:15.691202861 +0000 UTC m=+402.960123275" watchObservedRunningTime="2026-04-16 20:34:15.693504402 +0000 UTC m=+402.962424809"
Apr 16 20:34:16.677587 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:16.677552 2577 generic.go:358] "Generic (PLEG): container finished" podID="7ebe6d28-f518-4718-b16d-77682681699a" containerID="fec84957d9a3f94ae6de72c552c048957f7211fecbef9b12104f9cee768fd9ef" exitCode=0
Apr 16 20:34:16.677959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:16.677625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" event={"ID":"7ebe6d28-f518-4718-b16d-77682681699a","Type":"ContainerDied","Data":"fec84957d9a3f94ae6de72c552c048957f7211fecbef9b12104f9cee768fd9ef"}
Apr 16 20:34:17.534534 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.534497 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"]
Apr 16 20:34:17.537184 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.537163 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.540312 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.540291 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 20:34:17.541424 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.541405 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 20:34:17.541525 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.541425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-drdrr\""
Apr 16 20:34:17.541525 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.541499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 20:34:17.541640 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.541543 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:34:17.541640 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.541557 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 20:34:17.551072 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.551053 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"]
Apr 16 20:34:17.586589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.586559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d7128bde-4264-4df6-b066-f6f9e789cd5f-manager-config\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.586589 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.586591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7128bde-4264-4df6-b066-f6f9e789cd5f-metrics-cert\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.586734 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.586630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7128bde-4264-4df6-b066-f6f9e789cd5f-cert\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.586734 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.586655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv687\" (UniqueName: \"kubernetes.io/projected/d7128bde-4264-4df6-b066-f6f9e789cd5f-kube-api-access-tv687\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.687373 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.687348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv687\" (UniqueName: \"kubernetes.io/projected/d7128bde-4264-4df6-b066-f6f9e789cd5f-kube-api-access-tv687\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.687769 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.687408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d7128bde-4264-4df6-b066-f6f9e789cd5f-manager-config\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.687769 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.687426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7128bde-4264-4df6-b066-f6f9e789cd5f-metrics-cert\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.687769 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.687486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7128bde-4264-4df6-b066-f6f9e789cd5f-cert\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.688176 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.688153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d7128bde-4264-4df6-b066-f6f9e789cd5f-manager-config\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.689838 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.689818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7128bde-4264-4df6-b066-f6f9e789cd5f-cert\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.690004 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.689982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7128bde-4264-4df6-b066-f6f9e789cd5f-metrics-cert\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.699815 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.699789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv687\" (UniqueName: \"kubernetes.io/projected/d7128bde-4264-4df6-b066-f6f9e789cd5f-kube-api-access-tv687\") pod \"lws-controller-manager-7b555bff64-z6t98\" (UID: \"d7128bde-4264-4df6-b066-f6f9e789cd5f\") " pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.799587 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.799565 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2"
Apr 16 20:34:17.846899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.846874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:17.888645 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.888612 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-bundle\") pod \"7ebe6d28-f518-4718-b16d-77682681699a\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") "
Apr 16 20:34:17.888830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.888676 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-util\") pod \"7ebe6d28-f518-4718-b16d-77682681699a\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") "
Apr 16 20:34:17.888830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.888703 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mks6d\" (UniqueName: \"kubernetes.io/projected/7ebe6d28-f518-4718-b16d-77682681699a-kube-api-access-mks6d\") pod \"7ebe6d28-f518-4718-b16d-77682681699a\" (UID: \"7ebe6d28-f518-4718-b16d-77682681699a\") "
Apr 16 20:34:17.889376 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.889348 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-bundle" (OuterVolumeSpecName: "bundle") pod "7ebe6d28-f518-4718-b16d-77682681699a" (UID: "7ebe6d28-f518-4718-b16d-77682681699a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:17.891289 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.891257 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ebe6d28-f518-4718-b16d-77682681699a-kube-api-access-mks6d" (OuterVolumeSpecName: "kube-api-access-mks6d") pod "7ebe6d28-f518-4718-b16d-77682681699a" (UID: "7ebe6d28-f518-4718-b16d-77682681699a"). InnerVolumeSpecName "kube-api-access-mks6d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:34:17.895287 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.895015 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-util" (OuterVolumeSpecName: "util") pod "7ebe6d28-f518-4718-b16d-77682681699a" (UID: "7ebe6d28-f518-4718-b16d-77682681699a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:17.969159 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.969082 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"]
Apr 16 20:34:17.971382 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:34:17.971356 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7128bde_4264_4df6_b066_f6f9e789cd5f.slice/crio-e57ed7fa4b5ef7db870cdb9554a6c012bd0563354ca74c117e55e5e8064f6942 WatchSource:0}: Error finding container e57ed7fa4b5ef7db870cdb9554a6c012bd0563354ca74c117e55e5e8064f6942: Status 404 returned error can't find the container with id e57ed7fa4b5ef7db870cdb9554a6c012bd0563354ca74c117e55e5e8064f6942
Apr 16 20:34:17.990366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.990339 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:34:17.990366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.990362 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ebe6d28-f518-4718-b16d-77682681699a-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:34:17.990513 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:17.990373 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mks6d\" (UniqueName: \"kubernetes.io/projected/7ebe6d28-f518-4718-b16d-77682681699a-kube-api-access-mks6d\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:34:18.685172 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:18.685134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98" event={"ID":"d7128bde-4264-4df6-b066-f6f9e789cd5f","Type":"ContainerStarted","Data":"e57ed7fa4b5ef7db870cdb9554a6c012bd0563354ca74c117e55e5e8064f6942"}
Apr 16 20:34:18.686811 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:18.686790 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2"
Apr 16 20:34:18.686935 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:18.686789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9q6lc2" event={"ID":"7ebe6d28-f518-4718-b16d-77682681699a","Type":"ContainerDied","Data":"60ab7881af2b34ed056a56a8ec8b65b3984b9e034d3044f6c9089199e3bb5f0e"}
Apr 16 20:34:18.686935 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:18.686897 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ab7881af2b34ed056a56a8ec8b65b3984b9e034d3044f6c9089199e3bb5f0e"
Apr 16 20:34:19.691979 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:19.691941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98" event={"ID":"d7128bde-4264-4df6-b066-f6f9e789cd5f","Type":"ContainerStarted","Data":"5f55841ec9bffd5123db4d2e1a59b85c9049d9ed3ce7490f6647d995ca07486c"}
Apr 16 20:34:19.692355 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:19.692038 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:19.708752 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:19.708699 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98" podStartSLOduration=1.182032674 podStartE2EDuration="2.708685317s" podCreationTimestamp="2026-04-16 20:34:17 +0000 UTC" firstStartedPulling="2026-04-16 20:34:17.973149018 +0000 UTC m=+405.242069408" lastFinishedPulling="2026-04-16 20:34:19.499801648 +0000 UTC m=+406.768722051" observedRunningTime="2026-04-16 20:34:19.706904228 +0000 UTC m=+406.975824638" watchObservedRunningTime="2026-04-16 20:34:19.708685317 +0000 UTC m=+406.977605727"
Apr 16 20:34:26.680262 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:26.680233 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6cc777b675-b9vl2"
Apr 16 20:34:30.175519 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175479 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"]
Apr 16 20:34:30.175940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175923 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="pull"
Apr 16 20:34:30.175993 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175944 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="pull"
Apr 16 20:34:30.175993 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175955 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="extract"
Apr 16 20:34:30.175993 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175963 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="extract"
Apr 16 20:34:30.175993 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175990 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="util"
Apr 16 20:34:30.176113 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.175996 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="util"
Apr 16 20:34:30.176113 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.176047 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ebe6d28-f518-4718-b16d-77682681699a" containerName="extract"
Apr 16 20:34:30.181245 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.181225 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.184153 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.184129 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:34:30.184292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.184172 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:34:30.186166 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.186147 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5kphd\""
Apr 16 20:34:30.188445 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.188418 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"]
Apr 16 20:34:30.294810 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.294772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.294810 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.294810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtn8w\" (UniqueName: \"kubernetes.io/projected/88b876cc-98e5-485b-8841-6c5a6dd127b5-kube-api-access-wtn8w\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.295019 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.294867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.396053 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.396006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.396053 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.396061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtn8w\" (UniqueName: \"kubernetes.io/projected/88b876cc-98e5-485b-8841-6c5a6dd127b5-kube-api-access-wtn8w\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.396291 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.396132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.396523 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.396503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.396583 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.396556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.405109 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.405078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtn8w\" (UniqueName: \"kubernetes.io/projected/88b876cc-98e5-485b-8841-6c5a6dd127b5-kube-api-access-wtn8w\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.491956 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.491871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:30.656017 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.655962 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"]
Apr 16 20:34:30.657212 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:34:30.657180 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b876cc_98e5_485b_8841_6c5a6dd127b5.slice/crio-4fc09a5f9f5d2a97b08a522d2b672c20e52dc19b5f8cf5bfd4debf3d532b8fa3 WatchSource:0}: Error finding container 4fc09a5f9f5d2a97b08a522d2b672c20e52dc19b5f8cf5bfd4debf3d532b8fa3: Status 404 returned error can't find the container with id 4fc09a5f9f5d2a97b08a522d2b672c20e52dc19b5f8cf5bfd4debf3d532b8fa3
Apr 16 20:34:30.696876 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.696847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7b555bff64-z6t98"
Apr 16 20:34:30.730740 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.730702 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j" event={"ID":"88b876cc-98e5-485b-8841-6c5a6dd127b5","Type":"ContainerStarted","Data":"538d9fd5ef5ca64a99c61fc686bda6f9b0adcb88904fb32083758112ada0a3db"}
Apr 16 20:34:30.730888 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:30.730746 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j" event={"ID":"88b876cc-98e5-485b-8841-6c5a6dd127b5","Type":"ContainerStarted","Data":"4fc09a5f9f5d2a97b08a522d2b672c20e52dc19b5f8cf5bfd4debf3d532b8fa3"}
Apr 16 20:34:31.735151 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:31.735112 2577 generic.go:358] "Generic (PLEG): container finished" podID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerID="538d9fd5ef5ca64a99c61fc686bda6f9b0adcb88904fb32083758112ada0a3db" exitCode=0
Apr 16 20:34:31.735564 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:31.735200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j" event={"ID":"88b876cc-98e5-485b-8841-6c5a6dd127b5","Type":"ContainerDied","Data":"538d9fd5ef5ca64a99c61fc686bda6f9b0adcb88904fb32083758112ada0a3db"}
Apr 16 20:34:32.740407 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:32.740313 2577 generic.go:358] "Generic (PLEG): container finished" podID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerID="4dfa3c1fbbd53141960e33e440478f35de2b2eab6890ecce531eb3af9c4397da" exitCode=0
Apr 16 20:34:32.740799 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:32.740402 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j" event={"ID":"88b876cc-98e5-485b-8841-6c5a6dd127b5","Type":"ContainerDied","Data":"4dfa3c1fbbd53141960e33e440478f35de2b2eab6890ecce531eb3af9c4397da"}
Apr 16 20:34:33.745645 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:33.745611 2577 generic.go:358] "Generic (PLEG): container finished" podID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerID="9ba2ee7efbcdf91f944a075f266c83d01748799208abe0ed44811ac9028ac7c9" exitCode=0
Apr 16 20:34:33.746025 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:33.745700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j" event={"ID":"88b876cc-98e5-485b-8841-6c5a6dd127b5","Type":"ContainerDied","Data":"9ba2ee7efbcdf91f944a075f266c83d01748799208abe0ed44811ac9028ac7c9"}
Apr 16 20:34:34.876114 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.876089 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:34.937717 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.937687 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtn8w\" (UniqueName: \"kubernetes.io/projected/88b876cc-98e5-485b-8841-6c5a6dd127b5-kube-api-access-wtn8w\") pod \"88b876cc-98e5-485b-8841-6c5a6dd127b5\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") "
Apr 16 20:34:34.937903 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.937736 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-bundle\") pod \"88b876cc-98e5-485b-8841-6c5a6dd127b5\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") "
Apr 16 20:34:34.937903 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.937813 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-util\") pod \"88b876cc-98e5-485b-8841-6c5a6dd127b5\" (UID: \"88b876cc-98e5-485b-8841-6c5a6dd127b5\") "
Apr 16 20:34:34.938901 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.938870 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-bundle" (OuterVolumeSpecName: "bundle") pod "88b876cc-98e5-485b-8841-6c5a6dd127b5" (UID: "88b876cc-98e5-485b-8841-6c5a6dd127b5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:34.939973 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.939953 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b876cc-98e5-485b-8841-6c5a6dd127b5-kube-api-access-wtn8w" (OuterVolumeSpecName: "kube-api-access-wtn8w") pod "88b876cc-98e5-485b-8841-6c5a6dd127b5" (UID: "88b876cc-98e5-485b-8841-6c5a6dd127b5"). InnerVolumeSpecName "kube-api-access-wtn8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:34:34.946495 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:34.946448 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-util" (OuterVolumeSpecName: "util") pod "88b876cc-98e5-485b-8841-6c5a6dd127b5" (UID: "88b876cc-98e5-485b-8841-6c5a6dd127b5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:34:35.039354 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:35.039263 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtn8w\" (UniqueName: \"kubernetes.io/projected/88b876cc-98e5-485b-8841-6c5a6dd127b5-kube-api-access-wtn8w\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:34:35.039354 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:35.039296 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:34:35.039354 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:35.039305 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88b876cc-98e5-485b-8841-6c5a6dd127b5-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:34:35.754611 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:35.754566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j" event={"ID":"88b876cc-98e5-485b-8841-6c5a6dd127b5","Type":"ContainerDied","Data":"4fc09a5f9f5d2a97b08a522d2b672c20e52dc19b5f8cf5bfd4debf3d532b8fa3"}
Apr 16 20:34:35.754611 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:35.754605 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc09a5f9f5d2a97b08a522d2b672c20e52dc19b5f8cf5bfd4debf3d532b8fa3"
Apr 16 20:34:35.754611 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:35.754606 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355ss6j"
Apr 16 20:34:44.571800 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.571762 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"]
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572119 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="pull"
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572132 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="pull"
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572141 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="extract"
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572146 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="extract"
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572153 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="util"
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572158 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="util"
Apr 16 20:34:44.572271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.572220 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="88b876cc-98e5-485b-8841-6c5a6dd127b5" containerName="extract"
Apr 16 20:34:44.574961 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.574945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"
Apr 16 20:34:44.578779 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.578754 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:34:44.578930 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.578906 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5kphd\""
Apr 16 20:34:44.583248 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.583230 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:34:44.616257 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.616228 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"]
Apr 16 20:34:44.718625 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.718596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"
Apr 16 20:34:44.718625 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.718629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwtn\" (UniqueName: \"kubernetes.io/projected/04200f31-4540-49d7-9956-7580e8f136de-kube-api-access-pvwtn\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"
Apr 16 20:34:44.718837 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.718656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"
Apr 16 20:34:44.819798 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.819764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"
Apr 16 20:34:44.819798 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.819802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwtn\" (UniqueName: \"kubernetes.io/projected/04200f31-4540-49d7-9956-7580e8f136de-kube-api-access-pvwtn\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID:
\"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:44.820014 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.819900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:44.820231 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.820205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:44.820269 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.820241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:44.880815 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.880744 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwtn\" (UniqueName: \"kubernetes.io/projected/04200f31-4540-49d7-9956-7580e8f136de-kube-api-access-pvwtn\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:44.883482 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:44.883455 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:45.016890 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:45.016861 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz"] Apr 16 20:34:45.018955 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:34:45.018922 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04200f31_4540_49d7_9956_7580e8f136de.slice/crio-171a850f30ee0de939efc4f72f0a6175b05fa3c11ac6d761ccff868ec4cd4d7d WatchSource:0}: Error finding container 171a850f30ee0de939efc4f72f0a6175b05fa3c11ac6d761ccff868ec4cd4d7d: Status 404 returned error can't find the container with id 171a850f30ee0de939efc4f72f0a6175b05fa3c11ac6d761ccff868ec4cd4d7d Apr 16 20:34:45.789374 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:45.789295 2577 generic.go:358] "Generic (PLEG): container finished" podID="04200f31-4540-49d7-9956-7580e8f136de" containerID="58feb4e5b567a4a083480c44933091bade0c7f8b644be741ec8bc3da9e8133c4" exitCode=0 Apr 16 20:34:45.789374 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:45.789344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" event={"ID":"04200f31-4540-49d7-9956-7580e8f136de","Type":"ContainerDied","Data":"58feb4e5b567a4a083480c44933091bade0c7f8b644be741ec8bc3da9e8133c4"} Apr 16 20:34:45.789374 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:45.789372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" event={"ID":"04200f31-4540-49d7-9956-7580e8f136de","Type":"ContainerStarted","Data":"171a850f30ee0de939efc4f72f0a6175b05fa3c11ac6d761ccff868ec4cd4d7d"} Apr 16 20:34:46.794652 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:46.794616 2577 generic.go:358] "Generic (PLEG): container finished" podID="04200f31-4540-49d7-9956-7580e8f136de" containerID="a27355e00fa2966d1d0b8acac85929351d6017e8c33ab30aa00a96a00a5566ad" exitCode=0 Apr 16 20:34:46.795048 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:46.794709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" event={"ID":"04200f31-4540-49d7-9956-7580e8f136de","Type":"ContainerDied","Data":"a27355e00fa2966d1d0b8acac85929351d6017e8c33ab30aa00a96a00a5566ad"} Apr 16 20:34:47.800405 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:47.800370 2577 generic.go:358] "Generic (PLEG): container finished" podID="04200f31-4540-49d7-9956-7580e8f136de" containerID="b3ff7703039d40ea0419781d1a5acc9d1f7a549d659a010098a44e78ba9a38a1" exitCode=0 Apr 16 20:34:47.800790 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:47.800414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" event={"ID":"04200f31-4540-49d7-9956-7580e8f136de","Type":"ContainerDied","Data":"b3ff7703039d40ea0419781d1a5acc9d1f7a549d659a010098a44e78ba9a38a1"} Apr 16 20:34:48.925404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:48.925380 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:34:49.060260 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.060176 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-util\") pod \"04200f31-4540-49d7-9956-7580e8f136de\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " Apr 16 20:34:49.060406 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.060276 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-bundle\") pod \"04200f31-4540-49d7-9956-7580e8f136de\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " Apr 16 20:34:49.060406 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.060352 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvwtn\" (UniqueName: \"kubernetes.io/projected/04200f31-4540-49d7-9956-7580e8f136de-kube-api-access-pvwtn\") pod \"04200f31-4540-49d7-9956-7580e8f136de\" (UID: \"04200f31-4540-49d7-9956-7580e8f136de\") " Apr 16 20:34:49.061114 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.061085 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-bundle" (OuterVolumeSpecName: "bundle") pod "04200f31-4540-49d7-9956-7580e8f136de" (UID: "04200f31-4540-49d7-9956-7580e8f136de"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:49.062392 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.062368 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04200f31-4540-49d7-9956-7580e8f136de-kube-api-access-pvwtn" (OuterVolumeSpecName: "kube-api-access-pvwtn") pod "04200f31-4540-49d7-9956-7580e8f136de" (UID: "04200f31-4540-49d7-9956-7580e8f136de"). InnerVolumeSpecName "kube-api-access-pvwtn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:34:49.065877 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.065857 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-util" (OuterVolumeSpecName: "util") pod "04200f31-4540-49d7-9956-7580e8f136de" (UID: "04200f31-4540-49d7-9956-7580e8f136de"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:49.161224 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.161186 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pvwtn\" (UniqueName: \"kubernetes.io/projected/04200f31-4540-49d7-9956-7580e8f136de-kube-api-access-pvwtn\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:34:49.161224 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.161216 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:34:49.161224 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.161226 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04200f31-4540-49d7-9956-7580e8f136de-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:34:49.808239 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.808203 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" event={"ID":"04200f31-4540-49d7-9956-7580e8f136de","Type":"ContainerDied","Data":"171a850f30ee0de939efc4f72f0a6175b05fa3c11ac6d761ccff868ec4cd4d7d"} Apr 16 20:34:49.808411 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.808247 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="171a850f30ee0de939efc4f72f0a6175b05fa3c11ac6d761ccff868ec4cd4d7d" Apr 16 20:34:49.808411 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:34:49.808216 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bkkhz" Apr 16 20:35:01.481559 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481522 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n"] Apr 16 20:35:01.481916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481824 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="util" Apr 16 20:35:01.481916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481834 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="util" Apr 16 20:35:01.481916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481852 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="pull" Apr 16 20:35:01.481916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481858 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="pull" Apr 16 20:35:01.481916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481869 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="extract" Apr 16 20:35:01.481916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481875 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="extract" Apr 16 20:35:01.482206 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.481924 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="04200f31-4540-49d7-9956-7580e8f136de" containerName="extract" Apr 16 20:35:01.484318 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.484302 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.487027 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.487003 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-gq4gp\"" Apr 16 20:35:01.487148 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.487003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 20:35:01.497382 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.497356 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n"] Apr 16 20:35:01.668985 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.668945 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ncmg\" (UniqueName: \"kubernetes.io/projected/fa4767c5-9765-4d74-bd02-c64bde512afe-kube-api-access-2ncmg\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.668985 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:35:01.668986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669245 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669245 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669245 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fa4767c5-9765-4d74-bd02-c64bde512afe-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669245 
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669245 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669441 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.669441 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.669343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 
20:35:01.770964 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.770868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fa4767c5-9765-4d74-bd02-c64bde512afe-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.770964 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.770932 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.770971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771156 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:35:01.771060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ncmg\" (UniqueName: \"kubernetes.io/projected/fa4767c5-9765-4d74-bd02-c64bde512afe-kube-api-access-2ncmg\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771426 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771426 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771211 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771426 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771623 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771759 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771833 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/fa4767c5-9765-4d74-bd02-c64bde512afe-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.771947 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.771927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.773303 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.773281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.773637 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.773618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.779042 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.779008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/fa4767c5-9765-4d74-bd02-c64bde512afe-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.779151 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.779134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ncmg\" (UniqueName: \"kubernetes.io/projected/fa4767c5-9765-4d74-bd02-c64bde512afe-kube-api-access-2ncmg\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n\" (UID: \"fa4767c5-9765-4d74-bd02-c64bde512afe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.793951 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.793924 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:01.918328 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:01.918293 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n"] Apr 16 20:35:02.855224 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:02.855183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" event={"ID":"fa4767c5-9765-4d74-bd02-c64bde512afe","Type":"ContainerStarted","Data":"f7a658b72b1b8362f749abd4688d7f5ccc65de564df33740c50b3c9b0b665fba"} Apr 16 20:35:04.407954 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:04.407915 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:35:04.408251 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:04.407985 2577 
kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:35:04.408251 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:04.408009 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:35:04.866578 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:04.866538 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" event={"ID":"fa4767c5-9765-4d74-bd02-c64bde512afe","Type":"ContainerStarted","Data":"94755d940b8d27da3c251eda5df5fcef5b1008e25422b44a20b859bc1233184b"} Apr 16 20:35:04.886542 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:04.886493 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" podStartSLOduration=1.4033875789999999 podStartE2EDuration="3.886450841s" podCreationTimestamp="2026-04-16 20:35:01 +0000 UTC" firstStartedPulling="2026-04-16 20:35:01.924605943 +0000 UTC m=+449.193526331" lastFinishedPulling="2026-04-16 20:35:04.407669201 +0000 UTC m=+451.676589593" observedRunningTime="2026-04-16 20:35:04.884505201 +0000 UTC m=+452.153425608" watchObservedRunningTime="2026-04-16 20:35:04.886450841 +0000 UTC m=+452.155371524" Apr 16 20:35:05.794409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:05.794367 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:05.799074 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:05.799049 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" 
Apr 16 20:35:05.870358 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:05.870325 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:05.871502 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:05.871455 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n" Apr 16 20:35:31.188880 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.188846 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dff2g"] Apr 16 20:35:31.191347 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.191331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:31.193822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.193796 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:35:31.194758 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.194737 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-6r44t\"" Apr 16 20:35:31.194863 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.194766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:35:31.202366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.202336 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dff2g"] Apr 16 20:35:31.314674 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.314632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzhnr\" (UniqueName: 
\"kubernetes.io/projected/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461-kube-api-access-vzhnr\") pod \"kuadrant-operator-catalog-dff2g\" (UID: \"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461\") " pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:31.416148 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.416110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzhnr\" (UniqueName: \"kubernetes.io/projected/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461-kube-api-access-vzhnr\") pod \"kuadrant-operator-catalog-dff2g\" (UID: \"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461\") " pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:31.426199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.426170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzhnr\" (UniqueName: \"kubernetes.io/projected/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461-kube-api-access-vzhnr\") pod \"kuadrant-operator-catalog-dff2g\" (UID: \"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461\") " pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:31.500706 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.500613 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:31.562499 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.557947 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dff2g"] Apr 16 20:35:31.652862 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.652836 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dff2g"] Apr 16 20:35:31.654596 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:31.654571 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de0a69c_c92c_4206_a5eb_a2dcd1fd3461.slice/crio-82930ae47c013f63caab527affa642b11248d1cee63b50f8981e0facaf1c6008 WatchSource:0}: Error finding container 82930ae47c013f63caab527affa642b11248d1cee63b50f8981e0facaf1c6008: Status 404 returned error can't find the container with id 82930ae47c013f63caab527affa642b11248d1cee63b50f8981e0facaf1c6008 Apr 16 20:35:31.761759 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.761683 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cglvc"] Apr 16 20:35:31.764673 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.764655 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:31.771547 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.771523 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cglvc"] Apr 16 20:35:31.819018 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.818978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8b4\" (UniqueName: \"kubernetes.io/projected/3da3d1bd-62b1-4e70-90b9-c2c4960f4461-kube-api-access-xt8b4\") pod \"kuadrant-operator-catalog-cglvc\" (UID: \"3da3d1bd-62b1-4e70-90b9-c2c4960f4461\") " pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:31.919701 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.919667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8b4\" (UniqueName: \"kubernetes.io/projected/3da3d1bd-62b1-4e70-90b9-c2c4960f4461-kube-api-access-xt8b4\") pod \"kuadrant-operator-catalog-cglvc\" (UID: \"3da3d1bd-62b1-4e70-90b9-c2c4960f4461\") " pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:31.927608 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.927583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8b4\" (UniqueName: \"kubernetes.io/projected/3da3d1bd-62b1-4e70-90b9-c2c4960f4461-kube-api-access-xt8b4\") pod \"kuadrant-operator-catalog-cglvc\" (UID: \"3da3d1bd-62b1-4e70-90b9-c2c4960f4461\") " pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:31.959960 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:31.959925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" event={"ID":"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461","Type":"ContainerStarted","Data":"82930ae47c013f63caab527affa642b11248d1cee63b50f8981e0facaf1c6008"} Apr 16 20:35:32.075921 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:35:32.075842 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:32.202299 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:32.202270 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cglvc"] Apr 16 20:35:32.234598 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:32.234564 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da3d1bd_62b1_4e70_90b9_c2c4960f4461.slice/crio-1f550f23d0d36633cc90db0b4f74b97cd71dd6f3be3dd728f3ab669ba0278c72 WatchSource:0}: Error finding container 1f550f23d0d36633cc90db0b4f74b97cd71dd6f3be3dd728f3ab669ba0278c72: Status 404 returned error can't find the container with id 1f550f23d0d36633cc90db0b4f74b97cd71dd6f3be3dd728f3ab669ba0278c72 Apr 16 20:35:32.965983 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:32.965935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" event={"ID":"3da3d1bd-62b1-4e70-90b9-c2c4960f4461","Type":"ContainerStarted","Data":"1f550f23d0d36633cc90db0b4f74b97cd71dd6f3be3dd728f3ab669ba0278c72"} Apr 16 20:35:33.970966 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:33.970927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" event={"ID":"3da3d1bd-62b1-4e70-90b9-c2c4960f4461","Type":"ContainerStarted","Data":"d3d84a7632d5dabb98ef353d7652b745f495eb8036007eead985edd33bf4845d"} Apr 16 20:35:33.972169 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:33.972138 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" event={"ID":"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461","Type":"ContainerStarted","Data":"19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b"} Apr 16 20:35:33.972333 ip-10-0-132-101 kubenswrapper[2577]: I0416 
20:35:33.972268 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" podUID="9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" containerName="registry-server" containerID="cri-o://19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b" gracePeriod=2 Apr 16 20:35:33.987820 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:33.987777 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" podStartSLOduration=1.5559066449999999 podStartE2EDuration="2.987763157s" podCreationTimestamp="2026-04-16 20:35:31 +0000 UTC" firstStartedPulling="2026-04-16 20:35:32.235914448 +0000 UTC m=+479.504834836" lastFinishedPulling="2026-04-16 20:35:33.667770957 +0000 UTC m=+480.936691348" observedRunningTime="2026-04-16 20:35:33.984809406 +0000 UTC m=+481.253729816" watchObservedRunningTime="2026-04-16 20:35:33.987763157 +0000 UTC m=+481.256683566" Apr 16 20:35:34.000278 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.000229 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" podStartSLOduration=0.997809735 podStartE2EDuration="3.000215192s" podCreationTimestamp="2026-04-16 20:35:31 +0000 UTC" firstStartedPulling="2026-04-16 20:35:31.65595 +0000 UTC m=+478.924870387" lastFinishedPulling="2026-04-16 20:35:33.658355456 +0000 UTC m=+480.927275844" observedRunningTime="2026-04-16 20:35:33.998162557 +0000 UTC m=+481.267082968" watchObservedRunningTime="2026-04-16 20:35:34.000215192 +0000 UTC m=+481.269135601" Apr 16 20:35:34.203533 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.203507 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:34.342204 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.342095 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzhnr\" (UniqueName: \"kubernetes.io/projected/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461-kube-api-access-vzhnr\") pod \"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461\" (UID: \"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461\") " Apr 16 20:35:34.344256 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.344224 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461-kube-api-access-vzhnr" (OuterVolumeSpecName: "kube-api-access-vzhnr") pod "9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" (UID: "9de0a69c-c92c-4206-a5eb-a2dcd1fd3461"). InnerVolumeSpecName "kube-api-access-vzhnr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:35:34.443664 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.443617 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vzhnr\" (UniqueName: \"kubernetes.io/projected/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461-kube-api-access-vzhnr\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:34.977063 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.976966 2577 generic.go:358] "Generic (PLEG): container finished" podID="9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" containerID="19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b" exitCode=0 Apr 16 20:35:34.977534 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.977060 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" Apr 16 20:35:34.977534 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.977057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" event={"ID":"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461","Type":"ContainerDied","Data":"19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b"} Apr 16 20:35:34.977534 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.977159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dff2g" event={"ID":"9de0a69c-c92c-4206-a5eb-a2dcd1fd3461","Type":"ContainerDied","Data":"82930ae47c013f63caab527affa642b11248d1cee63b50f8981e0facaf1c6008"} Apr 16 20:35:34.977534 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.977177 2577 scope.go:117] "RemoveContainer" containerID="19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b" Apr 16 20:35:34.986496 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.986458 2577 scope.go:117] "RemoveContainer" containerID="19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b" Apr 16 20:35:34.986727 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:35:34.986707 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b\": container with ID starting with 19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b not found: ID does not exist" containerID="19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b" Apr 16 20:35:34.986785 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.986735 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b"} err="failed to get container status \"19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b\": rpc 
error: code = NotFound desc = could not find container \"19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b\": container with ID starting with 19cae1cf2f62465cfa93d576b18ee3336423b92c628be2384f7b71a8eaea5b1b not found: ID does not exist" Apr 16 20:35:34.997347 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.997325 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dff2g"] Apr 16 20:35:34.999351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:34.999328 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dff2g"] Apr 16 20:35:35.284191 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:35.284112 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" path="/var/lib/kubelet/pods/9de0a69c-c92c-4206-a5eb-a2dcd1fd3461/volumes" Apr 16 20:35:42.076384 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:42.076340 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:42.076911 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:42.076428 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:42.098392 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:42.098366 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:43.027853 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:43.027824 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-cglvc" Apr 16 20:35:46.324060 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.324022 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj"] Apr 16 20:35:46.324427 
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.324369 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" containerName="registry-server" Apr 16 20:35:46.324427 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.324381 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" containerName="registry-server" Apr 16 20:35:46.324545 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.324429 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9de0a69c-c92c-4206-a5eb-a2dcd1fd3461" containerName="registry-server" Apr 16 20:35:46.326508 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.326491 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.328770 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.328744 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-w2krl\"" Apr 16 20:35:46.333868 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.333842 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj"] Apr 16 20:35:46.444820 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.444772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pqwc\" (UniqueName: \"kubernetes.io/projected/165b09d2-b5da-4790-90e1-fb3e80407ae7-kube-api-access-8pqwc\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.444982 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.444841 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.444982 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.444946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.546004 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.545964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.546183 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.546114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.546247 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.546193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pqwc\" (UniqueName: 
\"kubernetes.io/projected/165b09d2-b5da-4790-90e1-fb3e80407ae7-kube-api-access-8pqwc\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.546487 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.546439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.546578 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.546450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.554579 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.554553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pqwc\" (UniqueName: \"kubernetes.io/projected/165b09d2-b5da-4790-90e1-fb3e80407ae7-kube-api-access-8pqwc\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.636540 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.636431 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" Apr 16 20:35:46.764621 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.764596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj"] Apr 16 20:35:46.766598 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:46.766568 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165b09d2_b5da_4790_90e1_fb3e80407ae7.slice/crio-9141b663c6f5aa0adba7651fa2a5796da4c4ff68716620dbdaf7927cff07c99a WatchSource:0}: Error finding container 9141b663c6f5aa0adba7651fa2a5796da4c4ff68716620dbdaf7927cff07c99a: Status 404 returned error can't find the container with id 9141b663c6f5aa0adba7651fa2a5796da4c4ff68716620dbdaf7927cff07c99a Apr 16 20:35:46.922275 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.922238 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7"] Apr 16 20:35:46.924632 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.924615 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:46.933583 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:46.933559 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7"] Apr 16 20:35:47.022178 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.022141 2577 generic.go:358] "Generic (PLEG): container finished" podID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerID="42ba25dfa33b27ef29ca995b6660f5741e4b1c3145b41052dd9867abb598a93a" exitCode=0 Apr 16 20:35:47.022383 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.022200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" event={"ID":"165b09d2-b5da-4790-90e1-fb3e80407ae7","Type":"ContainerDied","Data":"42ba25dfa33b27ef29ca995b6660f5741e4b1c3145b41052dd9867abb598a93a"} Apr 16 20:35:47.022383 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.022227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" event={"ID":"165b09d2-b5da-4790-90e1-fb3e80407ae7","Type":"ContainerStarted","Data":"9141b663c6f5aa0adba7651fa2a5796da4c4ff68716620dbdaf7927cff07c99a"} Apr 16 20:35:47.053057 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.053026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.053196 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.053063 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.053196 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.053091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psqdq\" (UniqueName: \"kubernetes.io/projected/8051c81e-8645-464d-83f1-69568256cc62-kube-api-access-psqdq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.154247 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.154214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.154415 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.154263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.154415 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.154391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-psqdq\" (UniqueName: \"kubernetes.io/projected/8051c81e-8645-464d-83f1-69568256cc62-kube-api-access-psqdq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.154721 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.154703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.154768 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.154727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.163083 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.163054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psqdq\" (UniqueName: \"kubernetes.io/projected/8051c81e-8645-464d-83f1-69568256cc62-kube-api-access-psqdq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:47.234764 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.234673 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7"
Apr 16 20:35:47.359395 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.359367    2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7"]
Apr 16 20:35:47.361715 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:47.361687    2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8051c81e_8645_464d_83f1_69568256cc62.slice/crio-6b797ccbe0057f2e83e16d0dcd13c0d153a3a7c129962a242979a2d882e7012b WatchSource:0}: Error finding container 6b797ccbe0057f2e83e16d0dcd13c0d153a3a7c129962a242979a2d882e7012b: Status 404 returned error can't find the container with id 6b797ccbe0057f2e83e16d0dcd13c0d153a3a7c129962a242979a2d882e7012b
Apr 16 20:35:47.525013 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.524931    2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"]
Apr 16 20:35:47.528030 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.528012    2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.535917 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.535894    2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"]
Apr 16 20:35:47.659138 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.659110    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrq8h\" (UniqueName: \"kubernetes.io/projected/f24b5e31-4ddd-4b62-b158-cc3346e320fe-kube-api-access-jrq8h\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.659249 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.659158    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.659249 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.659209    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.760040 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.760004    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrq8h\" (UniqueName: \"kubernetes.io/projected/f24b5e31-4ddd-4b62-b158-cc3346e320fe-kube-api-access-jrq8h\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.760205 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.760058    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.760205 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.760089    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.760431 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.760412    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.760476 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.760446    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.767971 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.767951    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrq8h\" (UniqueName: \"kubernetes.io/projected/f24b5e31-4ddd-4b62-b158-cc3346e320fe-kube-api-access-jrq8h\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.838097 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.838008    2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"
Apr 16 20:35:47.925193 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.925155    2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"]
Apr 16 20:35:47.929134 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.929105    2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:47.935851 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.935786    2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"]
Apr 16 20:35:47.967766 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:47.967727    2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr"]
Apr 16 20:35:47.969772 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:47.969745    2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24b5e31_4ddd_4b62_b158_cc3346e320fe.slice/crio-cba57f95c9b663986584c42e86480dc389af75d6190577a19701747fbcd25ee3 WatchSource:0}: Error finding container cba57f95c9b663986584c42e86480dc389af75d6190577a19701747fbcd25ee3: Status 404 returned error can't find the container with id cba57f95c9b663986584c42e86480dc389af75d6190577a19701747fbcd25ee3
Apr 16 20:35:48.027207 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.027168    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" event={"ID":"f24b5e31-4ddd-4b62-b158-cc3346e320fe","Type":"ContainerStarted","Data":"cba57f95c9b663986584c42e86480dc389af75d6190577a19701747fbcd25ee3"}
Apr 16 20:35:48.028388 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.028363    2577 generic.go:358] "Generic (PLEG): container finished" podID="8051c81e-8645-464d-83f1-69568256cc62" containerID="97401b4668a26ca0cdcca375487be8d3dba45cc1e6b3d081c0e56e79893d41ca" exitCode=0
Apr 16 20:35:48.028500 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.028440    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" event={"ID":"8051c81e-8645-464d-83f1-69568256cc62","Type":"ContainerDied","Data":"97401b4668a26ca0cdcca375487be8d3dba45cc1e6b3d081c0e56e79893d41ca"}
Apr 16 20:35:48.028500 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.028490    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" event={"ID":"8051c81e-8645-464d-83f1-69568256cc62","Type":"ContainerStarted","Data":"6b797ccbe0057f2e83e16d0dcd13c0d153a3a7c129962a242979a2d882e7012b"}
Apr 16 20:35:48.029908 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.029885    2577 generic.go:358] "Generic (PLEG): container finished" podID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerID="d965db3d39578ed969206f1050336ec5d5816108db54fe82a6f7028e90c73e0e" exitCode=0
Apr 16 20:35:48.029998 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.029932    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" event={"ID":"165b09d2-b5da-4790-90e1-fb3e80407ae7","Type":"ContainerDied","Data":"d965db3d39578ed969206f1050336ec5d5816108db54fe82a6f7028e90c73e0e"}
Apr 16 20:35:48.062688 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.062651    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.062860 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.062717    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.062860 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.062803    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l7wb\" (UniqueName: \"kubernetes.io/projected/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-kube-api-access-4l7wb\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.164182 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.164151    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.164327 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.164246    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.164327 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.164267    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l7wb\" (UniqueName: \"kubernetes.io/projected/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-kube-api-access-4l7wb\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.164582 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.164552    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.164669 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.164603    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.171757 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.171731    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l7wb\" (UniqueName: \"kubernetes.io/projected/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-kube-api-access-4l7wb\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.243119 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.243084    2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"
Apr 16 20:35:48.365067 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:48.365036    2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft"]
Apr 16 20:35:48.367664 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:48.367634    2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc929b21c_5a05_4bd7_bbb0_2a86f8d1d9b1.slice/crio-931eb7839e5a7325861108400796a4a8fd620e78c52cf87129c8ca417b51ce19 WatchSource:0}: Error finding container 931eb7839e5a7325861108400796a4a8fd620e78c52cf87129c8ca417b51ce19: Status 404 returned error can't find the container with id 931eb7839e5a7325861108400796a4a8fd620e78c52cf87129c8ca417b51ce19
Apr 16 20:35:49.035040 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.034952    2577 generic.go:358] "Generic (PLEG): container finished" podID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerID="412119e01d41c7064d9c9ebd55524d2fd2f61e365f77d391ed00359198ba35d9" exitCode=0
Apr 16 20:35:49.035193 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.035039    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" event={"ID":"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1","Type":"ContainerDied","Data":"412119e01d41c7064d9c9ebd55524d2fd2f61e365f77d391ed00359198ba35d9"}
Apr 16 20:35:49.035193 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.035071    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" event={"ID":"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1","Type":"ContainerStarted","Data":"931eb7839e5a7325861108400796a4a8fd620e78c52cf87129c8ca417b51ce19"}
Apr 16 20:35:49.036366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.036342    2577 generic.go:358] "Generic (PLEG): container finished" podID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerID="87feebc58d478331d16ece8870877e183c1563153ead3260dbc894aa2f1f7b90" exitCode=0
Apr 16 20:35:49.036458 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.036429    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" event={"ID":"f24b5e31-4ddd-4b62-b158-cc3346e320fe","Type":"ContainerDied","Data":"87feebc58d478331d16ece8870877e183c1563153ead3260dbc894aa2f1f7b90"}
Apr 16 20:35:49.038079 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.038057    2577 generic.go:358] "Generic (PLEG): container finished" podID="8051c81e-8645-464d-83f1-69568256cc62" containerID="108ae775873286f25ccb91cc2812ba80fd05c2393204e80083fa7042aa71d3d3" exitCode=0
Apr 16 20:35:49.038166 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.038122    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" event={"ID":"8051c81e-8645-464d-83f1-69568256cc62","Type":"ContainerDied","Data":"108ae775873286f25ccb91cc2812ba80fd05c2393204e80083fa7042aa71d3d3"}
Apr 16 20:35:49.040179 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.040151    2577 generic.go:358] "Generic (PLEG): container finished" podID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerID="ec2981f4f015fcdc0d0795af62e26e66bb0c74b8ff6ae99c300f011cd9c2996b" exitCode=0
Apr 16 20:35:49.040369 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:49.040176    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" event={"ID":"165b09d2-b5da-4790-90e1-fb3e80407ae7","Type":"ContainerDied","Data":"ec2981f4f015fcdc0d0795af62e26e66bb0c74b8ff6ae99c300f011cd9c2996b"}
Apr 16 20:35:50.049443 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.049410    2577 generic.go:358] "Generic (PLEG): container finished" podID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerID="a49fed19f0a6ad2594da163f611bc26fd921a55ab7ffdd4a28765c15be763196" exitCode=0
Apr 16 20:35:50.049878 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.049510    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" event={"ID":"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1","Type":"ContainerDied","Data":"a49fed19f0a6ad2594da163f611bc26fd921a55ab7ffdd4a28765c15be763196"}
Apr 16 20:35:50.051388 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.051232    2577 generic.go:358] "Generic (PLEG): container finished" podID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerID="6aabd1181fd5a0724a561239839864bae88eaad266a3777eb4daaa5e2b9a454e" exitCode=0
Apr 16 20:35:50.051388 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.051311    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" event={"ID":"f24b5e31-4ddd-4b62-b158-cc3346e320fe","Type":"ContainerDied","Data":"6aabd1181fd5a0724a561239839864bae88eaad266a3777eb4daaa5e2b9a454e"}
Apr 16 20:35:50.053310 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.053284    2577 generic.go:358] "Generic (PLEG): container finished" podID="8051c81e-8645-464d-83f1-69568256cc62" containerID="775e0fabfea504e0231ad40b8a6abe3975dd6e8c36eab8012d218604628de902" exitCode=0
Apr 16 20:35:50.053409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.053329    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" event={"ID":"8051c81e-8645-464d-83f1-69568256cc62","Type":"ContainerDied","Data":"775e0fabfea504e0231ad40b8a6abe3975dd6e8c36eab8012d218604628de902"}
Apr 16 20:35:50.180644 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.180583    2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj"
Apr 16 20:35:50.283680 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.283652    2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pqwc\" (UniqueName: \"kubernetes.io/projected/165b09d2-b5da-4790-90e1-fb3e80407ae7-kube-api-access-8pqwc\") pod \"165b09d2-b5da-4790-90e1-fb3e80407ae7\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") "
Apr 16 20:35:50.283848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.283733    2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-bundle\") pod \"165b09d2-b5da-4790-90e1-fb3e80407ae7\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") "
Apr 16 20:35:50.283848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.283752    2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-util\") pod \"165b09d2-b5da-4790-90e1-fb3e80407ae7\" (UID: \"165b09d2-b5da-4790-90e1-fb3e80407ae7\") "
Apr 16 20:35:50.284211 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.284186    2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-bundle" (OuterVolumeSpecName: "bundle") pod "165b09d2-b5da-4790-90e1-fb3e80407ae7" (UID: "165b09d2-b5da-4790-90e1-fb3e80407ae7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:35:50.285730 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.285708    2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165b09d2-b5da-4790-90e1-fb3e80407ae7-kube-api-access-8pqwc" (OuterVolumeSpecName: "kube-api-access-8pqwc") pod "165b09d2-b5da-4790-90e1-fb3e80407ae7" (UID: "165b09d2-b5da-4790-90e1-fb3e80407ae7"). InnerVolumeSpecName "kube-api-access-8pqwc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:35:50.289032 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.288996    2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-util" (OuterVolumeSpecName: "util") pod "165b09d2-b5da-4790-90e1-fb3e80407ae7" (UID: "165b09d2-b5da-4790-90e1-fb3e80407ae7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:35:50.384898 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.384849    2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pqwc\" (UniqueName: \"kubernetes.io/projected/165b09d2-b5da-4790-90e1-fb3e80407ae7-kube-api-access-8pqwc\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:35:50.384898 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.384885    2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:35:50.384898 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:50.384900    2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/165b09d2-b5da-4790-90e1-fb3e80407ae7-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:35:51.059392 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.059311    2577 generic.go:358] "Generic (PLEG): container finished" podID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerID="cc786bb7ae1f34aef11479fcd486f91d60018e1fd24023624a09f8afa3259e99" exitCode=0
Apr 16 20:35:51.059834 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.059398    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" event={"ID":"f24b5e31-4ddd-4b62-b158-cc3346e320fe","Type":"ContainerDied","Data":"cc786bb7ae1f34aef11479fcd486f91d60018e1fd24023624a09f8afa3259e99"}
Apr 16 20:35:51.061083 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.061063    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj" event={"ID":"165b09d2-b5da-4790-90e1-fb3e80407ae7","Type":"ContainerDied","Data":"9141b663c6f5aa0adba7651fa2a5796da4c4ff68716620dbdaf7927cff07c99a"}
Apr 16 20:35:51.061195 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.061087    2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9141b663c6f5aa0adba7651fa2a5796da4c4ff68716620dbdaf7927cff07c99a"
Apr 16 20:35:51.061195 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.061066    2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj"
Apr 16 20:35:51.063011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.062985    2577 generic.go:358] "Generic (PLEG): container finished" podID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerID="c51253eb9f57f56552fe3378a8cca36d223dd071f7f97133c9ccb0d3b5ac0b05" exitCode=0
Apr 16 20:35:51.063105 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.063068    2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" event={"ID":"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1","Type":"ContainerDied","Data":"c51253eb9f57f56552fe3378a8cca36d223dd071f7f97133c9ccb0d3b5ac0b05"}
Apr 16 20:35:51.189713 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.189689    2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7"
Apr 16 20:35:51.292121 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.292078    2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-bundle\") pod \"8051c81e-8645-464d-83f1-69568256cc62\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") "
Apr 16 20:35:51.292285 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.292168    2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psqdq\" (UniqueName: \"kubernetes.io/projected/8051c81e-8645-464d-83f1-69568256cc62-kube-api-access-psqdq\") pod \"8051c81e-8645-464d-83f1-69568256cc62\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") "
Apr 16 20:35:51.292285 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.292235    2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-util\") pod \"8051c81e-8645-464d-83f1-69568256cc62\" (UID: \"8051c81e-8645-464d-83f1-69568256cc62\") "
Apr 16 20:35:51.292651 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.292616    2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-bundle" (OuterVolumeSpecName: "bundle") pod "8051c81e-8645-464d-83f1-69568256cc62" (UID: "8051c81e-8645-464d-83f1-69568256cc62"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:35:51.294244 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.294220    2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8051c81e-8645-464d-83f1-69568256cc62-kube-api-access-psqdq" (OuterVolumeSpecName: "kube-api-access-psqdq") pod "8051c81e-8645-464d-83f1-69568256cc62" (UID: "8051c81e-8645-464d-83f1-69568256cc62"). InnerVolumeSpecName "kube-api-access-psqdq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:35:51.297865 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.297832    2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-util" (OuterVolumeSpecName: "util") pod "8051c81e-8645-464d-83f1-69568256cc62" (UID: "8051c81e-8645-464d-83f1-69568256cc62"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:35:51.393489 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.393380    2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-psqdq\" (UniqueName: \"kubernetes.io/projected/8051c81e-8645-464d-83f1-69568256cc62-kube-api-access-psqdq\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:35:51.393489 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.393417    2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:35:51.393489 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.393428    2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8051c81e-8645-464d-83f1-69568256cc62-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:35:51.445562 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445525    2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cd5f8c8f8-pcrpc"]
Apr 16 20:35:51.445912 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445895    2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="util"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445915    2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="util"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445928    2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="pull"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445936    2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="pull"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445944    2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="extract"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445952    2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="extract"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445975    2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="pull"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445983    2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="pull"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.445992    2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="extract"
Apr 16 20:35:51.446011 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.446000    2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="extract"
Apr 16 20:35:51.446412 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.446022    2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="util"
Apr 16 20:35:51.446412 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.446030    2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="util"
Apr 16 20:35:51.446412 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.446124    2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8051c81e-8645-464d-83f1-69568256cc62" containerName="extract"
Apr 16 20:35:51.446412 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.446136    2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="165b09d2-b5da-4790-90e1-fb3e80407ae7" containerName="extract"
Apr 16 20:35:51.448830 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.448810    2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.464505 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.464478    2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cd5f8c8f8-pcrpc"]
Apr 16 20:35:51.595130 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595093    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-config\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.595300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595142    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-serving-cert\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.595300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595168    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-oauth-serving-cert\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.595300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595211    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9hg8\" (UniqueName: \"kubernetes.io/projected/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-kube-api-access-s9hg8\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.595300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595252    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-service-ca\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.595300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595270    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-trusted-ca-bundle\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.595300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.595290    2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-oauth-config\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.696723 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696686    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-config\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.696882 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696729    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-serving-cert\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.696882 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696802    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-oauth-serving-cert\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.696882 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696833    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9hg8\" (UniqueName: \"kubernetes.io/projected/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-kube-api-access-s9hg8\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697030 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696882    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-service-ca\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697030 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696909    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-trusted-ca-bundle\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697030 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.696936    2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-oauth-config\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697578 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.697543    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-config\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697706 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.697601    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-service-ca\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697706 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.697625    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-oauth-serving-cert\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc"
Apr 16 20:35:51.697805 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.697724    2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-trusted-ca-bundle\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:35:51.699303 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.699286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-oauth-config\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:35:51.699378 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.699362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-console-serving-cert\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:35:51.704418 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.704400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9hg8\" (UniqueName: \"kubernetes.io/projected/42e7fcdb-a2f7-4a91-88f6-cc1acc359a83-kube-api-access-s9hg8\") pod \"console-cd5f8c8f8-pcrpc\" (UID: \"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83\") " pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:35:51.757536 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.757499 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:35:51.882533 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:51.882509 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cd5f8c8f8-pcrpc"] Apr 16 20:35:51.884429 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:35:51.884398 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e7fcdb_a2f7_4a91_88f6_cc1acc359a83.slice/crio-3e004fb5ffc276d30045a8ac93543eaec3ee2c9e54ca029746c4886164248bf7 WatchSource:0}: Error finding container 3e004fb5ffc276d30045a8ac93543eaec3ee2c9e54ca029746c4886164248bf7: Status 404 returned error can't find the container with id 3e004fb5ffc276d30045a8ac93543eaec3ee2c9e54ca029746c4886164248bf7 Apr 16 20:35:52.068899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.068816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" event={"ID":"8051c81e-8645-464d-83f1-69568256cc62","Type":"ContainerDied","Data":"6b797ccbe0057f2e83e16d0dcd13c0d153a3a7c129962a242979a2d882e7012b"} Apr 16 20:35:52.068899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.068853 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b797ccbe0057f2e83e16d0dcd13c0d153a3a7c129962a242979a2d882e7012b" Apr 16 20:35:52.068899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.068832 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7" Apr 16 20:35:52.070227 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.070196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd5f8c8f8-pcrpc" event={"ID":"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83","Type":"ContainerStarted","Data":"4b952f3dddea6cabea1cf28090953ceb0f7d20abdbbae9df0fc46c633a2acc55"} Apr 16 20:35:52.070365 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.070230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd5f8c8f8-pcrpc" event={"ID":"42e7fcdb-a2f7-4a91-88f6-cc1acc359a83","Type":"ContainerStarted","Data":"3e004fb5ffc276d30045a8ac93543eaec3ee2c9e54ca029746c4886164248bf7"} Apr 16 20:35:52.100055 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.099782 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cd5f8c8f8-pcrpc" podStartSLOduration=1.099763193 podStartE2EDuration="1.099763193s" podCreationTimestamp="2026-04-16 20:35:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:35:52.096318529 +0000 UTC m=+499.365238940" watchObservedRunningTime="2026-04-16 20:35:52.099763193 +0000 UTC m=+499.368683605" Apr 16 20:35:52.196349 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.196321 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" Apr 16 20:35:52.229871 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.229849 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" Apr 16 20:35:52.302118 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.302080 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-bundle\") pod \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " Apr 16 20:35:52.302269 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.302164 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l7wb\" (UniqueName: \"kubernetes.io/projected/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-kube-api-access-4l7wb\") pod \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " Apr 16 20:35:52.302308 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.302291 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-util\") pod \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\" (UID: \"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1\") " Apr 16 20:35:52.302927 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.302887 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-bundle" (OuterVolumeSpecName: "bundle") pod "c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" (UID: "c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:35:52.304247 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.304225 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-kube-api-access-4l7wb" (OuterVolumeSpecName: "kube-api-access-4l7wb") pod "c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" (UID: "c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1"). InnerVolumeSpecName "kube-api-access-4l7wb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:35:52.307611 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.307590 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-util" (OuterVolumeSpecName: "util") pod "c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" (UID: "c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:35:52.403791 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.403710 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-bundle\") pod \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " Apr 16 20:35:52.403791 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.403774 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-util\") pod \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " Apr 16 20:35:52.403934 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.403832 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrq8h\" (UniqueName: \"kubernetes.io/projected/f24b5e31-4ddd-4b62-b158-cc3346e320fe-kube-api-access-jrq8h\") pod 
\"f24b5e31-4ddd-4b62-b158-cc3346e320fe\" (UID: \"f24b5e31-4ddd-4b62-b158-cc3346e320fe\") " Apr 16 20:35:52.404065 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.404045 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:52.404097 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.404072 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:52.404097 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.404087 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4l7wb\" (UniqueName: \"kubernetes.io/projected/c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1-kube-api-access-4l7wb\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:52.404330 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.404297 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-bundle" (OuterVolumeSpecName: "bundle") pod "f24b5e31-4ddd-4b62-b158-cc3346e320fe" (UID: "f24b5e31-4ddd-4b62-b158-cc3346e320fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:35:52.405978 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.405955 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24b5e31-4ddd-4b62-b158-cc3346e320fe-kube-api-access-jrq8h" (OuterVolumeSpecName: "kube-api-access-jrq8h") pod "f24b5e31-4ddd-4b62-b158-cc3346e320fe" (UID: "f24b5e31-4ddd-4b62-b158-cc3346e320fe"). InnerVolumeSpecName "kube-api-access-jrq8h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:35:52.409216 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.409193 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-util" (OuterVolumeSpecName: "util") pod "f24b5e31-4ddd-4b62-b158-cc3346e320fe" (UID: "f24b5e31-4ddd-4b62-b158-cc3346e320fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:35:52.504553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.504516 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrq8h\" (UniqueName: \"kubernetes.io/projected/f24b5e31-4ddd-4b62-b158-cc3346e320fe-kube-api-access-jrq8h\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:52.504553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.504546 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:52.504553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:52.504559 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f24b5e31-4ddd-4b62-b158-cc3346e320fe-util\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:35:53.075574 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:53.075497 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" Apr 16 20:35:53.075967 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:53.075493 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft" event={"ID":"c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1","Type":"ContainerDied","Data":"931eb7839e5a7325861108400796a4a8fd620e78c52cf87129c8ca417b51ce19"} Apr 16 20:35:53.075967 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:53.075660 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="931eb7839e5a7325861108400796a4a8fd620e78c52cf87129c8ca417b51ce19" Apr 16 20:35:53.077300 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:53.077277 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" event={"ID":"f24b5e31-4ddd-4b62-b158-cc3346e320fe","Type":"ContainerDied","Data":"cba57f95c9b663986584c42e86480dc389af75d6190577a19701747fbcd25ee3"} Apr 16 20:35:53.077404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:53.077303 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cba57f95c9b663986584c42e86480dc389af75d6190577a19701747fbcd25ee3" Apr 16 20:35:53.077404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:35:53.077327 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr" Apr 16 20:36:01.758593 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:01.758554 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:36:01.759085 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:01.758606 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:36:01.763606 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:01.763580 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:36:02.123243 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.123164 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cd5f8c8f8-pcrpc" Apr 16 20:36:02.173252 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.173211 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cbd54666c-prnm9"] Apr 16 20:36:02.454060 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454027 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz"] Apr 16 20:36:02.454366 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454353 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="pull" Apr 16 20:36:02.454409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454367 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="pull" Apr 16 20:36:02.454409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454383 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="pull" Apr 16 
20:36:02.454409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454388 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="pull" Apr 16 20:36:02.454409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454394 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="extract" Apr 16 20:36:02.454409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454400 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="extract" Apr 16 20:36:02.454409 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454410 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="util" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454416 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="util" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454422 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="util" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454428 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="util" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454433 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="extract" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454438 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="extract" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454513 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f24b5e31-4ddd-4b62-b158-cc3346e320fe" containerName="extract" Apr 16 20:36:02.454614 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.454527 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1" containerName="extract" Apr 16 20:36:02.457245 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.457228 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:02.461605 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.461576 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 20:36:02.461740 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.461618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-rcwl4\"" Apr 16 20:36:02.471773 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.471746 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz"] Apr 16 20:36:02.592529 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.592488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5kf\" (UniqueName: \"kubernetes.io/projected/918b6360-89b2-4953-9df8-a6962b44bb8f-kube-api-access-ng5kf\") pod \"dns-operator-controller-manager-648d5c98bc-cqqdz\" (UID: \"918b6360-89b2-4953-9df8-a6962b44bb8f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:02.693859 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.693825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5kf\" (UniqueName: 
\"kubernetes.io/projected/918b6360-89b2-4953-9df8-a6962b44bb8f-kube-api-access-ng5kf\") pod \"dns-operator-controller-manager-648d5c98bc-cqqdz\" (UID: \"918b6360-89b2-4953-9df8-a6962b44bb8f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:02.703145 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.703117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5kf\" (UniqueName: \"kubernetes.io/projected/918b6360-89b2-4953-9df8-a6962b44bb8f-kube-api-access-ng5kf\") pod \"dns-operator-controller-manager-648d5c98bc-cqqdz\" (UID: \"918b6360-89b2-4953-9df8-a6962b44bb8f\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:02.767125 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.767041 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:02.897124 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:02.897099 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz"] Apr 16 20:36:02.898619 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:36:02.898594 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918b6360_89b2_4953_9df8_a6962b44bb8f.slice/crio-a593ae8c1c8b0d1216fe37e8a751750462a7cac02e16792efac9eba67cfd3606 WatchSource:0}: Error finding container a593ae8c1c8b0d1216fe37e8a751750462a7cac02e16792efac9eba67cfd3606: Status 404 returned error can't find the container with id a593ae8c1c8b0d1216fe37e8a751750462a7cac02e16792efac9eba67cfd3606 Apr 16 20:36:03.123664 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:03.123583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" 
event={"ID":"918b6360-89b2-4953-9df8-a6962b44bb8f","Type":"ContainerStarted","Data":"a593ae8c1c8b0d1216fe37e8a751750462a7cac02e16792efac9eba67cfd3606"} Apr 16 20:36:05.932663 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:05.932629 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k"] Apr 16 20:36:05.936491 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:05.935882 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:05.939952 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:05.939926 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-cjwzr\"" Apr 16 20:36:05.946404 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:05.946381 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k"] Apr 16 20:36:06.122984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.122941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggn6s\" (UniqueName: \"kubernetes.io/projected/eb62833c-fd46-488b-a614-43f7234b220f-kube-api-access-ggn6s\") pod \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" (UID: \"eb62833c-fd46-488b-a614-43f7234b220f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:06.139535 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.139491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" event={"ID":"918b6360-89b2-4953-9df8-a6962b44bb8f","Type":"ContainerStarted","Data":"4bda4572449d2ed6fdcce02be0aad6ac08907577dfd85fd1bbbe9b7dc81408c5"} Apr 16 20:36:06.139733 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.139609 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:06.157377 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.157322 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" podStartSLOduration=1.526153353 podStartE2EDuration="4.15730756s" podCreationTimestamp="2026-04-16 20:36:02 +0000 UTC" firstStartedPulling="2026-04-16 20:36:02.900515247 +0000 UTC m=+510.169435635" lastFinishedPulling="2026-04-16 20:36:05.531669454 +0000 UTC m=+512.800589842" observedRunningTime="2026-04-16 20:36:06.15550165 +0000 UTC m=+513.424422063" watchObservedRunningTime="2026-04-16 20:36:06.15730756 +0000 UTC m=+513.426227970" Apr 16 20:36:06.223588 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.223492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggn6s\" (UniqueName: \"kubernetes.io/projected/eb62833c-fd46-488b-a614-43f7234b220f-kube-api-access-ggn6s\") pod \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" (UID: \"eb62833c-fd46-488b-a614-43f7234b220f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:06.232108 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.232070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggn6s\" (UniqueName: \"kubernetes.io/projected/eb62833c-fd46-488b-a614-43f7234b220f-kube-api-access-ggn6s\") pod \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" (UID: \"eb62833c-fd46-488b-a614-43f7234b220f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:06.249831 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.249799 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:06.389440 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:06.389412 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k"] Apr 16 20:36:06.391132 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:36:06.391102 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb62833c_fd46_488b_a614_43f7234b220f.slice/crio-543d58e61481e509dd2a688b4efede3fa2168b118cf407b1907874726555a029 WatchSource:0}: Error finding container 543d58e61481e509dd2a688b4efede3fa2168b118cf407b1907874726555a029: Status 404 returned error can't find the container with id 543d58e61481e509dd2a688b4efede3fa2168b118cf407b1907874726555a029 Apr 16 20:36:07.148434 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:07.148397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" event={"ID":"eb62833c-fd46-488b-a614-43f7234b220f","Type":"ContainerStarted","Data":"543d58e61481e509dd2a688b4efede3fa2168b118cf407b1907874726555a029"} Apr 16 20:36:09.156628 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:09.156594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" event={"ID":"eb62833c-fd46-488b-a614-43f7234b220f","Type":"ContainerStarted","Data":"61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5"} Apr 16 20:36:09.157067 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:09.156686 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:09.174157 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:09.174095 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" podStartSLOduration=2.411921026 podStartE2EDuration="4.174074943s" podCreationTimestamp="2026-04-16 20:36:05 +0000 UTC" firstStartedPulling="2026-04-16 20:36:06.393009743 +0000 UTC m=+513.661930132" lastFinishedPulling="2026-04-16 20:36:08.155163659 +0000 UTC m=+515.424084049" observedRunningTime="2026-04-16 20:36:09.172526068 +0000 UTC m=+516.441446483" watchObservedRunningTime="2026-04-16 20:36:09.174074943 +0000 UTC m=+516.442995354" Apr 16 20:36:14.044059 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.044020 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-lhfd9"] Apr 16 20:36:14.047581 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.047561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:14.051274 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.051252 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-d9ftp\"" Apr 16 20:36:14.057275 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.057250 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-lhfd9"] Apr 16 20:36:14.087130 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.087089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v265\" (UniqueName: \"kubernetes.io/projected/6733ba7b-b41e-49aa-83a1-d9182b15b58b-kube-api-access-5v265\") pod \"authorino-operator-657f44b778-lhfd9\" (UID: \"6733ba7b-b41e-49aa-83a1-d9182b15b58b\") " pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:14.188120 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.188082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v265\" 
(UniqueName: \"kubernetes.io/projected/6733ba7b-b41e-49aa-83a1-d9182b15b58b-kube-api-access-5v265\") pod \"authorino-operator-657f44b778-lhfd9\" (UID: \"6733ba7b-b41e-49aa-83a1-d9182b15b58b\") " pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:14.198099 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.198063 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v265\" (UniqueName: \"kubernetes.io/projected/6733ba7b-b41e-49aa-83a1-d9182b15b58b-kube-api-access-5v265\") pod \"authorino-operator-657f44b778-lhfd9\" (UID: \"6733ba7b-b41e-49aa-83a1-d9182b15b58b\") " pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:14.358514 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.358411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:14.513007 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:14.512955 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-lhfd9"] Apr 16 20:36:14.515387 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:36:14.515349 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6733ba7b_b41e_49aa_83a1_d9182b15b58b.slice/crio-ad63678652e9c11abd557c734e6278d5a3f835255084c56237b0beb030a3d709 WatchSource:0}: Error finding container ad63678652e9c11abd557c734e6278d5a3f835255084c56237b0beb030a3d709: Status 404 returned error can't find the container with id ad63678652e9c11abd557c734e6278d5a3f835255084c56237b0beb030a3d709 Apr 16 20:36:15.182632 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:15.182588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" 
event={"ID":"6733ba7b-b41e-49aa-83a1-d9182b15b58b","Type":"ContainerStarted","Data":"ad63678652e9c11abd557c734e6278d5a3f835255084c56237b0beb030a3d709"} Apr 16 20:36:17.151630 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:17.151545 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cqqdz" Apr 16 20:36:17.192070 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:17.192035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" event={"ID":"6733ba7b-b41e-49aa-83a1-d9182b15b58b","Type":"ContainerStarted","Data":"88e4ce610eeaff947853253d2cbcf0b48a8d8358ed766f152a630ab567b0b674"} Apr 16 20:36:17.192253 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:17.192124 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:17.210917 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:17.210870 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" podStartSLOduration=1.018888267 podStartE2EDuration="3.210856198s" podCreationTimestamp="2026-04-16 20:36:14 +0000 UTC" firstStartedPulling="2026-04-16 20:36:14.517374869 +0000 UTC m=+521.786295271" lastFinishedPulling="2026-04-16 20:36:16.709342811 +0000 UTC m=+523.978263202" observedRunningTime="2026-04-16 20:36:17.208950465 +0000 UTC m=+524.477870886" watchObservedRunningTime="2026-04-16 20:36:17.210856198 +0000 UTC m=+524.479776607" Apr 16 20:36:20.162771 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:20.162733 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:27.192668 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.192606 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-cbd54666c-prnm9" podUID="a47ef05d-c48a-4d72-8b8a-88348425f32f" containerName="console" containerID="cri-o://7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16" gracePeriod=15 Apr 16 20:36:27.435620 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.435598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cbd54666c-prnm9_a47ef05d-c48a-4d72-8b8a-88348425f32f/console/0.log" Apr 16 20:36:27.435737 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.435661 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:36:27.502984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.502897 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-config\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.502984 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.502936 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-oauth-config\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.503199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.502992 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-trusted-ca-bundle\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.503199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503026 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-oauth-serving-cert\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.503199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503052 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-serving-cert\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.503199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503116 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-service-ca\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.503199 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503168 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg9tc\" (UniqueName: \"kubernetes.io/projected/a47ef05d-c48a-4d72-8b8a-88348425f32f-kube-api-access-gg9tc\") pod \"a47ef05d-c48a-4d72-8b8a-88348425f32f\" (UID: \"a47ef05d-c48a-4d72-8b8a-88348425f32f\") " Apr 16 20:36:27.503450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503408 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-config" (OuterVolumeSpecName: "console-config") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:36:27.503576 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503552 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-service-ca" (OuterVolumeSpecName: "service-ca") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:36:27.503643 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503579 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:36:27.503643 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.503630 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:36:27.505178 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.505145 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:36:27.505178 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.505160 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:36:27.505314 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.505206 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a47ef05d-c48a-4d72-8b8a-88348425f32f-kube-api-access-gg9tc" (OuterVolumeSpecName: "kube-api-access-gg9tc") pod "a47ef05d-c48a-4d72-8b8a-88348425f32f" (UID: "a47ef05d-c48a-4d72-8b8a-88348425f32f"). InnerVolumeSpecName "kube-api-access-gg9tc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:36:27.603793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.603757 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-service-ca\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:27.603793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.603787 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gg9tc\" (UniqueName: \"kubernetes.io/projected/a47ef05d-c48a-4d72-8b8a-88348425f32f-kube-api-access-gg9tc\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:27.603793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.603801 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-config\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:27.604026 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:36:27.603810 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-oauth-config\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:27.604026 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.603818 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-trusted-ca-bundle\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:27.604026 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.603827 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a47ef05d-c48a-4d72-8b8a-88348425f32f-oauth-serving-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:27.604026 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:27.603836 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a47ef05d-c48a-4d72-8b8a-88348425f32f-console-serving-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:28.199084 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.199051 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-lhfd9" Apr 16 20:36:28.236766 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.236740 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cbd54666c-prnm9_a47ef05d-c48a-4d72-8b8a-88348425f32f/console/0.log" Apr 16 20:36:28.236932 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.236784 2577 generic.go:358] "Generic (PLEG): container finished" podID="a47ef05d-c48a-4d72-8b8a-88348425f32f" containerID="7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16" exitCode=2 Apr 16 20:36:28.236932 
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.236871 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cbd54666c-prnm9" Apr 16 20:36:28.236932 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.236882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cbd54666c-prnm9" event={"ID":"a47ef05d-c48a-4d72-8b8a-88348425f32f","Type":"ContainerDied","Data":"7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16"} Apr 16 20:36:28.236932 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.236917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cbd54666c-prnm9" event={"ID":"a47ef05d-c48a-4d72-8b8a-88348425f32f","Type":"ContainerDied","Data":"a3f658bc72583f3f4039a44e9898641df55c943d407f66818dd5b51091185372"} Apr 16 20:36:28.237149 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.236936 2577 scope.go:117] "RemoveContainer" containerID="7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16" Apr 16 20:36:28.246885 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.246424 2577 scope.go:117] "RemoveContainer" containerID="7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16" Apr 16 20:36:28.246885 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:36:28.246822 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16\": container with ID starting with 7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16 not found: ID does not exist" containerID="7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16" Apr 16 20:36:28.246885 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.246856 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16"} err="failed 
to get container status \"7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16\": rpc error: code = NotFound desc = could not find container \"7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16\": container with ID starting with 7bb09b7e88be9ea3f426de46048a0074feeb11abdc1e6efe4b68d854bc804f16 not found: ID does not exist" Apr 16 20:36:28.264322 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.264297 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cbd54666c-prnm9"] Apr 16 20:36:28.269380 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:28.269358 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cbd54666c-prnm9"] Apr 16 20:36:29.285675 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.285639 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a47ef05d-c48a-4d72-8b8a-88348425f32f" path="/var/lib/kubelet/pods/a47ef05d-c48a-4d72-8b8a-88348425f32f/volumes" Apr 16 20:36:29.293478 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.293441 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k"] Apr 16 20:36:29.293704 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.293681 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" podUID="eb62833c-fd46-488b-a614-43f7234b220f" containerName="manager" containerID="cri-o://61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5" gracePeriod=2 Apr 16 20:36:29.310040 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.310015 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k"] Apr 16 20:36:29.346840 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.346812 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q"] Apr 16 20:36:29.347172 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.347156 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a47ef05d-c48a-4d72-8b8a-88348425f32f" containerName="console" Apr 16 20:36:29.347217 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.347175 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47ef05d-c48a-4d72-8b8a-88348425f32f" containerName="console" Apr 16 20:36:29.347217 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.347197 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb62833c-fd46-488b-a614-43f7234b220f" containerName="manager" Apr 16 20:36:29.347217 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.347203 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb62833c-fd46-488b-a614-43f7234b220f" containerName="manager" Apr 16 20:36:29.347322 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.347253 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a47ef05d-c48a-4d72-8b8a-88348425f32f" containerName="console" Apr 16 20:36:29.347322 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.347260 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb62833c-fd46-488b-a614-43f7234b220f" containerName="manager" Apr 16 20:36:29.350297 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.350279 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:29.367051 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.367026 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q"] Apr 16 20:36:29.394582 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.394549 2577 status_manager.go:895] "Failed to get status for pod" podUID="eb62833c-fd46-488b-a614-43f7234b220f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" err="pods \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" is forbidden: User \"system:node:ip-10-0-132-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-101.ec2.internal' and this object" Apr 16 20:36:29.418772 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.418746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjbjj\" (UniqueName: \"kubernetes.io/projected/1aa88fac-6e4d-43a9-b227-e0c2f32adc1d-kube-api-access-fjbjj\") pod \"limitador-operator-controller-manager-85c4996f8c-vq99q\" (UID: \"1aa88fac-6e4d-43a9-b227-e0c2f32adc1d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:29.519397 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.519365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjbjj\" (UniqueName: \"kubernetes.io/projected/1aa88fac-6e4d-43a9-b227-e0c2f32adc1d-kube-api-access-fjbjj\") pod \"limitador-operator-controller-manager-85c4996f8c-vq99q\" (UID: \"1aa88fac-6e4d-43a9-b227-e0c2f32adc1d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:29.524449 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.524429 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:29.526986 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.526961 2577 status_manager.go:895] "Failed to get status for pod" podUID="eb62833c-fd46-488b-a614-43f7234b220f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" err="pods \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" is forbidden: User \"system:node:ip-10-0-132-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-101.ec2.internal' and this object" Apr 16 20:36:29.539805 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.539742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjbjj\" (UniqueName: \"kubernetes.io/projected/1aa88fac-6e4d-43a9-b227-e0c2f32adc1d-kube-api-access-fjbjj\") pod \"limitador-operator-controller-manager-85c4996f8c-vq99q\" (UID: \"1aa88fac-6e4d-43a9-b227-e0c2f32adc1d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:29.620371 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.620336 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggn6s\" (UniqueName: \"kubernetes.io/projected/eb62833c-fd46-488b-a614-43f7234b220f-kube-api-access-ggn6s\") pod \"eb62833c-fd46-488b-a614-43f7234b220f\" (UID: \"eb62833c-fd46-488b-a614-43f7234b220f\") " Apr 16 20:36:29.622297 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.622274 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb62833c-fd46-488b-a614-43f7234b220f-kube-api-access-ggn6s" (OuterVolumeSpecName: "kube-api-access-ggn6s") pod "eb62833c-fd46-488b-a614-43f7234b220f" (UID: "eb62833c-fd46-488b-a614-43f7234b220f"). InnerVolumeSpecName "kube-api-access-ggn6s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:36:29.674779 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.674745 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:29.722210 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.722049 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggn6s\" (UniqueName: \"kubernetes.io/projected/eb62833c-fd46-488b-a614-43f7234b220f-kube-api-access-ggn6s\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:36:29.811384 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:29.811355 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q"] Apr 16 20:36:29.812885 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:36:29.812857 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa88fac_6e4d_43a9_b227_e0c2f32adc1d.slice/crio-b7a1a493a86679d62fccb7c9200f5a72dd1bba93d05e595d7f755b2c07c02c1b WatchSource:0}: Error finding container b7a1a493a86679d62fccb7c9200f5a72dd1bba93d05e595d7f755b2c07c02c1b: Status 404 returned error can't find the container with id b7a1a493a86679d62fccb7c9200f5a72dd1bba93d05e595d7f755b2c07c02c1b Apr 16 20:36:30.246664 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.246620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" event={"ID":"1aa88fac-6e4d-43a9-b227-e0c2f32adc1d","Type":"ContainerStarted","Data":"155d8782d5ef88bd12f183522f2f5a9a8f394c421b3e35693af076180edb7837"} Apr 16 20:36:30.246846 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.246670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" 
event={"ID":"1aa88fac-6e4d-43a9-b227-e0c2f32adc1d","Type":"ContainerStarted","Data":"b7a1a493a86679d62fccb7c9200f5a72dd1bba93d05e595d7f755b2c07c02c1b"} Apr 16 20:36:30.246899 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.246879 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:30.247773 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.247746 2577 generic.go:358] "Generic (PLEG): container finished" podID="eb62833c-fd46-488b-a614-43f7234b220f" containerID="61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5" exitCode=0 Apr 16 20:36:30.247871 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.247801 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" Apr 16 20:36:30.247871 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.247828 2577 scope.go:117] "RemoveContainer" containerID="61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5" Apr 16 20:36:30.256985 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.256967 2577 scope.go:117] "RemoveContainer" containerID="61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5" Apr 16 20:36:30.257248 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:36:30.257229 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5\": container with ID starting with 61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5 not found: ID does not exist" containerID="61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5" Apr 16 20:36:30.257301 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.257260 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5"} err="failed to get container status \"61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5\": rpc error: code = NotFound desc = could not find container \"61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5\": container with ID starting with 61cdf72b0e1f0423f1fa9b9d578024b9a2084770a5af7d0fde8264d2ebfe28a5 not found: ID does not exist" Apr 16 20:36:30.268038 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.268009 2577 status_manager.go:895] "Failed to get status for pod" podUID="eb62833c-fd46-488b-a614-43f7234b220f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" err="pods \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" is forbidden: User \"system:node:ip-10-0-132-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-101.ec2.internal' and this object" Apr 16 20:36:30.268365 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.268330 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" podStartSLOduration=1.268317799 podStartE2EDuration="1.268317799s" podCreationTimestamp="2026-04-16 20:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:30.266074312 +0000 UTC m=+537.534994733" watchObservedRunningTime="2026-04-16 20:36:30.268317799 +0000 UTC m=+537.537238273" Apr 16 20:36:30.269926 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:30.269895 2577 status_manager.go:895] "Failed to get status for pod" podUID="eb62833c-fd46-488b-a614-43f7234b220f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-z8n9k" err="pods \"limitador-operator-controller-manager-85c4996f8c-z8n9k\" is forbidden: User 
\"system:node:ip-10-0-132-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-101.ec2.internal' and this object" Apr 16 20:36:31.284099 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:31.284052 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb62833c-fd46-488b-a614-43f7234b220f" path="/var/lib/kubelet/pods/eb62833c-fd46-488b-a614-43f7234b220f/volumes" Apr 16 20:36:41.255358 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:41.255326 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vq99q" Apr 16 20:36:58.612429 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.612387 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv"] Apr 16 20:36:58.617319 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.617284 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.619853 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.619817 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-w64lk\"" Apr 16 20:36:58.632230 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.632201 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv"] Apr 16 20:36:58.773729 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.773729 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.773958 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.773958 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.773958 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.773958 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.774105 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.774105 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.773994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.774105 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.774049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltt7\" (UniqueName: \"kubernetes.io/projected/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-kube-api-access-cltt7\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.874720 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.874634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.874720 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.874696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cltt7\" (UniqueName: \"kubernetes.io/projected/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-kube-api-access-cltt7\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.874959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.874742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.874959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.874763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.874959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.874787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.874959 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.874834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875170 ip-10-0-132-101 
kubenswrapper[2577]: I0416 20:36:58.874957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875170 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875170 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875170 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875387 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875387 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875387 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.875531 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.875514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.877299 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.877277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: 
\"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.877491 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.877453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.882098 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.882077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.882298 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.882278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltt7\" (UniqueName: \"kubernetes.io/projected/1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a-kube-api-access-cltt7\") pod \"maas-default-gateway-openshift-default-58b6f876-pjmcv\" (UID: \"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:58.931014 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:58.930979 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:36:59.062496 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.062455 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv"] Apr 16 20:36:59.064580 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:36:59.064554 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad42fac_c0e0_4e58_bc1c_bb9fbf5bc66a.slice/crio-70cbd3a78115260dda8c7b1e2afafa961ee33c2e3e9ccc03cbafedac404b8f0a WatchSource:0}: Error finding container 70cbd3a78115260dda8c7b1e2afafa961ee33c2e3e9ccc03cbafedac404b8f0a: Status 404 returned error can't find the container with id 70cbd3a78115260dda8c7b1e2afafa961ee33c2e3e9ccc03cbafedac404b8f0a Apr 16 20:36:59.066801 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.066773 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:36:59.066864 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.066837 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:36:59.066901 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.066864 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:36:59.360349 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.360312 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" 
event={"ID":"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a","Type":"ContainerStarted","Data":"8c188dcf20e6d3cb7053e304420450794db8fd3754beb6c1b1a6bf855abf517a"} Apr 16 20:36:59.360349 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.360351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" event={"ID":"1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a","Type":"ContainerStarted","Data":"70cbd3a78115260dda8c7b1e2afafa961ee33c2e3e9ccc03cbafedac404b8f0a"} Apr 16 20:36:59.382743 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.382615 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" podStartSLOduration=1.382595561 podStartE2EDuration="1.382595561s" podCreationTimestamp="2026-04-16 20:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:36:59.380881365 +0000 UTC m=+566.649801776" watchObservedRunningTime="2026-04-16 20:36:59.382595561 +0000 UTC m=+566.651515972" Apr 16 20:36:59.931624 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:36:59.931579 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:37:00.936214 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:00.936176 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:37:01.368044 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:01.367958 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:37:01.368928 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:01.368909 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-pjmcv" Apr 16 20:37:03.691732 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.691700 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8s7w5"] Apr 16 20:37:03.694163 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.694136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:03.696671 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.696653 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-d4w5m\"" Apr 16 20:37:03.704132 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.704108 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8s7w5"] Apr 16 20:37:03.821098 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.821061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbrl\" (UniqueName: \"kubernetes.io/projected/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9-kube-api-access-xwbrl\") pod \"authorino-f99f4b5cd-8s7w5\" (UID: \"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9\") " pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:03.823352 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.823331 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-v6sj4"] Apr 16 20:37:03.825548 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.825533 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:03.832970 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.832947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-v6sj4"] Apr 16 20:37:03.921655 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.921624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbrl\" (UniqueName: \"kubernetes.io/projected/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9-kube-api-access-xwbrl\") pod \"authorino-f99f4b5cd-8s7w5\" (UID: \"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9\") " pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:03.921818 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.921706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnkh8\" (UniqueName: \"kubernetes.io/projected/8d8a4b51-138d-4cfe-b48b-6867912c5163-kube-api-access-xnkh8\") pod \"authorino-7498df8756-v6sj4\" (UID: \"8d8a4b51-138d-4cfe-b48b-6867912c5163\") " pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:03.929171 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:03.929146 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbrl\" (UniqueName: \"kubernetes.io/projected/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9-kube-api-access-xwbrl\") pod \"authorino-f99f4b5cd-8s7w5\" (UID: \"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9\") " pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:04.005177 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.005095 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:04.022144 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.022114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnkh8\" (UniqueName: \"kubernetes.io/projected/8d8a4b51-138d-4cfe-b48b-6867912c5163-kube-api-access-xnkh8\") pod \"authorino-7498df8756-v6sj4\" (UID: \"8d8a4b51-138d-4cfe-b48b-6867912c5163\") " pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:04.031246 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.031217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnkh8\" (UniqueName: \"kubernetes.io/projected/8d8a4b51-138d-4cfe-b48b-6867912c5163-kube-api-access-xnkh8\") pod \"authorino-7498df8756-v6sj4\" (UID: \"8d8a4b51-138d-4cfe-b48b-6867912c5163\") " pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:04.127155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.127128 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8s7w5"] Apr 16 20:37:04.129572 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:37:04.129540 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd46f4dcb_0bd2_423b_aa1e_d0881aadb9f9.slice/crio-df5c3d39bdf9e91eb72b64a458641ee049f6be001e47cde7f1081125ab4806c3 WatchSource:0}: Error finding container df5c3d39bdf9e91eb72b64a458641ee049f6be001e47cde7f1081125ab4806c3: Status 404 returned error can't find the container with id df5c3d39bdf9e91eb72b64a458641ee049f6be001e47cde7f1081125ab4806c3 Apr 16 20:37:04.135311 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.135287 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:04.255524 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.255447 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-v6sj4"] Apr 16 20:37:04.257092 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:37:04.257062 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d8a4b51_138d_4cfe_b48b_6867912c5163.slice/crio-f5424844a6cec216d745f446fb3330fe44e35cae388624c9676acdbbba14a944 WatchSource:0}: Error finding container f5424844a6cec216d745f446fb3330fe44e35cae388624c9676acdbbba14a944: Status 404 returned error can't find the container with id f5424844a6cec216d745f446fb3330fe44e35cae388624c9676acdbbba14a944 Apr 16 20:37:04.379089 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.379052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" event={"ID":"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9","Type":"ContainerStarted","Data":"df5c3d39bdf9e91eb72b64a458641ee049f6be001e47cde7f1081125ab4806c3"} Apr 16 20:37:04.380128 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:04.380105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-v6sj4" event={"ID":"8d8a4b51-138d-4cfe-b48b-6867912c5163","Type":"ContainerStarted","Data":"f5424844a6cec216d745f446fb3330fe44e35cae388624c9676acdbbba14a944"} Apr 16 20:37:07.398327 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:07.398278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" event={"ID":"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9","Type":"ContainerStarted","Data":"1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700"} Apr 16 20:37:07.399678 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:07.399650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-7498df8756-v6sj4" event={"ID":"8d8a4b51-138d-4cfe-b48b-6867912c5163","Type":"ContainerStarted","Data":"09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d"} Apr 16 20:37:07.413351 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:07.413305 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" podStartSLOduration=2.062672426 podStartE2EDuration="4.413291259s" podCreationTimestamp="2026-04-16 20:37:03 +0000 UTC" firstStartedPulling="2026-04-16 20:37:04.130807204 +0000 UTC m=+571.399727592" lastFinishedPulling="2026-04-16 20:37:06.481426034 +0000 UTC m=+573.750346425" observedRunningTime="2026-04-16 20:37:07.412325788 +0000 UTC m=+574.681246199" watchObservedRunningTime="2026-04-16 20:37:07.413291259 +0000 UTC m=+574.682211708" Apr 16 20:37:07.430195 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:07.430145 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-v6sj4" podStartSLOduration=2.211939999 podStartE2EDuration="4.430131261s" podCreationTimestamp="2026-04-16 20:37:03 +0000 UTC" firstStartedPulling="2026-04-16 20:37:04.258526165 +0000 UTC m=+571.527446556" lastFinishedPulling="2026-04-16 20:37:06.476717429 +0000 UTC m=+573.745637818" observedRunningTime="2026-04-16 20:37:07.428191939 +0000 UTC m=+574.697112350" watchObservedRunningTime="2026-04-16 20:37:07.430131261 +0000 UTC m=+574.699051672" Apr 16 20:37:07.450916 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:07.450850 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8s7w5"] Apr 16 20:37:09.407901 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:09.407860 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" podUID="d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" containerName="authorino" 
containerID="cri-o://1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700" gracePeriod=30 Apr 16 20:37:09.642828 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:09.642804 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:09.776661 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:09.776625 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwbrl\" (UniqueName: \"kubernetes.io/projected/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9-kube-api-access-xwbrl\") pod \"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9\" (UID: \"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9\") " Apr 16 20:37:09.783005 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:09.779503 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9-kube-api-access-xwbrl" (OuterVolumeSpecName: "kube-api-access-xwbrl") pod "d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" (UID: "d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9"). InnerVolumeSpecName "kube-api-access-xwbrl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:09.878291 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:09.878245 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwbrl\" (UniqueName: \"kubernetes.io/projected/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9-kube-api-access-xwbrl\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:37:10.413243 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.413204 2577 generic.go:358] "Generic (PLEG): container finished" podID="d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" containerID="1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700" exitCode=0 Apr 16 20:37:10.413701 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.413252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" event={"ID":"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9","Type":"ContainerDied","Data":"1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700"} Apr 16 20:37:10.413701 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.413271 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" Apr 16 20:37:10.413701 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.413284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8s7w5" event={"ID":"d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9","Type":"ContainerDied","Data":"df5c3d39bdf9e91eb72b64a458641ee049f6be001e47cde7f1081125ab4806c3"} Apr 16 20:37:10.413701 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.413303 2577 scope.go:117] "RemoveContainer" containerID="1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700" Apr 16 20:37:10.422997 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.422970 2577 scope.go:117] "RemoveContainer" containerID="1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700" Apr 16 20:37:10.423267 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:37:10.423242 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700\": container with ID starting with 1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700 not found: ID does not exist" containerID="1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700" Apr 16 20:37:10.423353 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.423274 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700"} err="failed to get container status \"1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700\": rpc error: code = NotFound desc = could not find container \"1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700\": container with ID starting with 1f360438e41945330286baeda22ef5cf6b00c789ba600d199f2f27a461064700 not found: ID does not exist" Apr 16 20:37:10.434116 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.434091 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8s7w5"] Apr 16 20:37:10.437869 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:10.437847 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8s7w5"] Apr 16 20:37:11.284285 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:11.284250 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" path="/var/lib/kubelet/pods/d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9/volumes" Apr 16 20:37:33.227667 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:33.227641 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:37:33.228157 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:33.227641 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 20:37:38.067971 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.067934 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-bq5p5"] Apr 16 20:37:38.068369 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.068279 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" containerName="authorino" Apr 16 20:37:38.068369 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.068290 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" containerName="authorino" Apr 16 20:37:38.068446 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.068385 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d46f4dcb-0bd2-423b-aa1e-d0881aadb9f9" containerName="authorino" Apr 16 20:37:38.071115 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.071099 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:38.077216 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.077184 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-bq5p5"] Apr 16 20:37:38.109453 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.109423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbbx\" (UniqueName: \"kubernetes.io/projected/7f47980d-11b3-4134-830c-01238f5e7aee-kube-api-access-rmbbx\") pod \"authorino-8b475cf9f-bq5p5\" (UID: \"7f47980d-11b3-4134-830c-01238f5e7aee\") " pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:38.210921 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.210884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbbx\" (UniqueName: \"kubernetes.io/projected/7f47980d-11b3-4134-830c-01238f5e7aee-kube-api-access-rmbbx\") pod \"authorino-8b475cf9f-bq5p5\" (UID: \"7f47980d-11b3-4134-830c-01238f5e7aee\") " pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:38.218162 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.218139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbbx\" (UniqueName: \"kubernetes.io/projected/7f47980d-11b3-4134-830c-01238f5e7aee-kube-api-access-rmbbx\") pod \"authorino-8b475cf9f-bq5p5\" (UID: \"7f47980d-11b3-4134-830c-01238f5e7aee\") " pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:38.310290 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.310248 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-bq5p5"] Apr 16 20:37:38.310628 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.310610 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:38.334622 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.334591 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-746bd9d694-hts9h"] Apr 16 20:37:38.340539 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.338705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:38.349658 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.349599 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-746bd9d694-hts9h"] Apr 16 20:37:38.416859 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.412990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zjwp\" (UniqueName: \"kubernetes.io/projected/9f182864-ca00-4506-aba3-520a6edb10c0-kube-api-access-4zjwp\") pod \"authorino-746bd9d694-hts9h\" (UID: \"9f182864-ca00-4506-aba3-520a6edb10c0\") " pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:38.481206 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.481177 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-bq5p5"] Apr 16 20:37:38.483635 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:37:38.483603 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f47980d_11b3_4134_830c_01238f5e7aee.slice/crio-168e01d2d814de535cf586bc3bd70a325bfea06473e533ea66f46de6b5155410 WatchSource:0}: Error finding container 168e01d2d814de535cf586bc3bd70a325bfea06473e533ea66f46de6b5155410: Status 404 returned error can't find the container with id 168e01d2d814de535cf586bc3bd70a325bfea06473e533ea66f46de6b5155410 Apr 16 20:37:38.514476 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.514439 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-4zjwp\" (UniqueName: \"kubernetes.io/projected/9f182864-ca00-4506-aba3-520a6edb10c0-kube-api-access-4zjwp\") pod \"authorino-746bd9d694-hts9h\" (UID: \"9f182864-ca00-4506-aba3-520a6edb10c0\") " pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:38.523654 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.523623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zjwp\" (UniqueName: \"kubernetes.io/projected/9f182864-ca00-4506-aba3-520a6edb10c0-kube-api-access-4zjwp\") pod \"authorino-746bd9d694-hts9h\" (UID: \"9f182864-ca00-4506-aba3-520a6edb10c0\") " pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:38.528553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.528526 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-746bd9d694-hts9h"] Apr 16 20:37:38.528892 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.528876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:38.532121 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.532093 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" event={"ID":"7f47980d-11b3-4134-830c-01238f5e7aee","Type":"ContainerStarted","Data":"168e01d2d814de535cf586bc3bd70a325bfea06473e533ea66f46de6b5155410"} Apr 16 20:37:38.565850 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.564760 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5d8d5c7698-sxm55"] Apr 16 20:37:38.568770 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.568738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5d8d5c7698-sxm55"] Apr 16 20:37:38.568928 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.568781 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.571292 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.571231 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 20:37:38.615714 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.615687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqjr\" (UniqueName: \"kubernetes.io/projected/addb2fa1-4f6c-469d-8c41-08187434453f-kube-api-access-6jqjr\") pod \"authorino-5d8d5c7698-sxm55\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") " pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.615881 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.615772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/addb2fa1-4f6c-469d-8c41-08187434453f-tls-cert\") pod \"authorino-5d8d5c7698-sxm55\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") " pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.695566 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.695540 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-746bd9d694-hts9h"] Apr 16 20:37:38.697452 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:37:38.697428 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f182864_ca00_4506_aba3_520a6edb10c0.slice/crio-1643a677b7e91b9d50ebb0bcd76905f4e82431e02d1e8b464da91b5f0cfd9f89 WatchSource:0}: Error finding container 1643a677b7e91b9d50ebb0bcd76905f4e82431e02d1e8b464da91b5f0cfd9f89: Status 404 returned error can't find the container with id 1643a677b7e91b9d50ebb0bcd76905f4e82431e02d1e8b464da91b5f0cfd9f89 Apr 16 20:37:38.716516 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.716486 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/addb2fa1-4f6c-469d-8c41-08187434453f-tls-cert\") pod \"authorino-5d8d5c7698-sxm55\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") " pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.716647 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.716541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqjr\" (UniqueName: \"kubernetes.io/projected/addb2fa1-4f6c-469d-8c41-08187434453f-kube-api-access-6jqjr\") pod \"authorino-5d8d5c7698-sxm55\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") " pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.718843 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.718820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/addb2fa1-4f6c-469d-8c41-08187434453f-tls-cert\") pod \"authorino-5d8d5c7698-sxm55\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") " pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.724391 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.724371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqjr\" (UniqueName: \"kubernetes.io/projected/addb2fa1-4f6c-469d-8c41-08187434453f-kube-api-access-6jqjr\") pod \"authorino-5d8d5c7698-sxm55\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") " pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:38.880448 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:38.880412 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" Apr 16 20:37:39.064505 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:37:39.064445 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaddb2fa1_4f6c_469d_8c41_08187434453f.slice/crio-7e29ac23e2b65acbe1549243dbbbfeb177dc3fe3796498bf237c0e7c0adff125 WatchSource:0}: Error finding container 7e29ac23e2b65acbe1549243dbbbfeb177dc3fe3796498bf237c0e7c0adff125: Status 404 returned error can't find the container with id 7e29ac23e2b65acbe1549243dbbbfeb177dc3fe3796498bf237c0e7c0adff125 Apr 16 20:37:39.064913 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.064882 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5d8d5c7698-sxm55"] Apr 16 20:37:39.549904 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.549848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" event={"ID":"7f47980d-11b3-4134-830c-01238f5e7aee","Type":"ContainerStarted","Data":"f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9"} Apr 16 20:37:39.550393 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.550034 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" podUID="7f47980d-11b3-4134-830c-01238f5e7aee" containerName="authorino" containerID="cri-o://f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9" gracePeriod=30 Apr 16 20:37:39.552271 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.552208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" event={"ID":"addb2fa1-4f6c-469d-8c41-08187434453f","Type":"ContainerStarted","Data":"7e29ac23e2b65acbe1549243dbbbfeb177dc3fe3796498bf237c0e7c0adff125"} Apr 16 20:37:39.560192 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.560161 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/authorino-746bd9d694-hts9h" event={"ID":"9f182864-ca00-4506-aba3-520a6edb10c0","Type":"ContainerStarted","Data":"cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66"} Apr 16 20:37:39.560287 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.560205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-746bd9d694-hts9h" event={"ID":"9f182864-ca00-4506-aba3-520a6edb10c0","Type":"ContainerStarted","Data":"1643a677b7e91b9d50ebb0bcd76905f4e82431e02d1e8b464da91b5f0cfd9f89"} Apr 16 20:37:39.560350 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.560331 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-746bd9d694-hts9h" podUID="9f182864-ca00-4506-aba3-520a6edb10c0" containerName="authorino" containerID="cri-o://cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66" gracePeriod=30 Apr 16 20:37:39.571815 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.571772 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" podStartSLOduration=1.280064167 podStartE2EDuration="1.571742711s" podCreationTimestamp="2026-04-16 20:37:38 +0000 UTC" firstStartedPulling="2026-04-16 20:37:38.485293008 +0000 UTC m=+605.754213396" lastFinishedPulling="2026-04-16 20:37:38.776971539 +0000 UTC m=+606.045891940" observedRunningTime="2026-04-16 20:37:39.568554469 +0000 UTC m=+606.837474878" watchObservedRunningTime="2026-04-16 20:37:39.571742711 +0000 UTC m=+606.840663122" Apr 16 20:37:39.583524 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.583452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-746bd9d694-hts9h" podStartSLOduration=1.218543662 podStartE2EDuration="1.583438197s" podCreationTimestamp="2026-04-16 20:37:38 +0000 UTC" firstStartedPulling="2026-04-16 20:37:38.69880652 +0000 UTC m=+605.967726908" 
lastFinishedPulling="2026-04-16 20:37:39.063701043 +0000 UTC m=+606.332621443" observedRunningTime="2026-04-16 20:37:39.581190075 +0000 UTC m=+606.850110497" watchObservedRunningTime="2026-04-16 20:37:39.583438197 +0000 UTC m=+606.852358607" Apr 16 20:37:39.826012 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.825983 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:39.865487 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.860846 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:39.930198 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.930166 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zjwp\" (UniqueName: \"kubernetes.io/projected/9f182864-ca00-4506-aba3-520a6edb10c0-kube-api-access-4zjwp\") pod \"9f182864-ca00-4506-aba3-520a6edb10c0\" (UID: \"9f182864-ca00-4506-aba3-520a6edb10c0\") " Apr 16 20:37:39.930368 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.930270 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmbbx\" (UniqueName: \"kubernetes.io/projected/7f47980d-11b3-4134-830c-01238f5e7aee-kube-api-access-rmbbx\") pod \"7f47980d-11b3-4134-830c-01238f5e7aee\" (UID: \"7f47980d-11b3-4134-830c-01238f5e7aee\") " Apr 16 20:37:39.932159 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.932130 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f182864-ca00-4506-aba3-520a6edb10c0-kube-api-access-4zjwp" (OuterVolumeSpecName: "kube-api-access-4zjwp") pod "9f182864-ca00-4506-aba3-520a6edb10c0" (UID: "9f182864-ca00-4506-aba3-520a6edb10c0"). InnerVolumeSpecName "kube-api-access-4zjwp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:39.932290 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:39.932277 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f47980d-11b3-4134-830c-01238f5e7aee-kube-api-access-rmbbx" (OuterVolumeSpecName: "kube-api-access-rmbbx") pod "7f47980d-11b3-4134-830c-01238f5e7aee" (UID: "7f47980d-11b3-4134-830c-01238f5e7aee"). InnerVolumeSpecName "kube-api-access-rmbbx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:40.030994 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.030915 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zjwp\" (UniqueName: \"kubernetes.io/projected/9f182864-ca00-4506-aba3-520a6edb10c0-kube-api-access-4zjwp\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:37:40.030994 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.030945 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmbbx\" (UniqueName: \"kubernetes.io/projected/7f47980d-11b3-4134-830c-01238f5e7aee-kube-api-access-rmbbx\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:37:40.567512 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.567474 2577 generic.go:358] "Generic (PLEG): container finished" podID="7f47980d-11b3-4134-830c-01238f5e7aee" containerID="f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9" exitCode=0 Apr 16 20:37:40.567973 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.567532 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" Apr 16 20:37:40.567973 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.567555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" event={"ID":"7f47980d-11b3-4134-830c-01238f5e7aee","Type":"ContainerDied","Data":"f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9"} Apr 16 20:37:40.567973 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.567596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-bq5p5" event={"ID":"7f47980d-11b3-4134-830c-01238f5e7aee","Type":"ContainerDied","Data":"168e01d2d814de535cf586bc3bd70a325bfea06473e533ea66f46de6b5155410"} Apr 16 20:37:40.567973 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.567615 2577 scope.go:117] "RemoveContainer" containerID="f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9" Apr 16 20:37:40.569116 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.569038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" event={"ID":"addb2fa1-4f6c-469d-8c41-08187434453f","Type":"ContainerStarted","Data":"596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6"} Apr 16 20:37:40.570286 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.570262 2577 generic.go:358] "Generic (PLEG): container finished" podID="9f182864-ca00-4506-aba3-520a6edb10c0" containerID="cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66" exitCode=0 Apr 16 20:37:40.570416 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.570300 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-746bd9d694-hts9h" Apr 16 20:37:40.570416 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.570303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-746bd9d694-hts9h" event={"ID":"9f182864-ca00-4506-aba3-520a6edb10c0","Type":"ContainerDied","Data":"cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66"} Apr 16 20:37:40.570416 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.570405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-746bd9d694-hts9h" event={"ID":"9f182864-ca00-4506-aba3-520a6edb10c0","Type":"ContainerDied","Data":"1643a677b7e91b9d50ebb0bcd76905f4e82431e02d1e8b464da91b5f0cfd9f89"} Apr 16 20:37:40.577865 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.577843 2577 scope.go:117] "RemoveContainer" containerID="f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9" Apr 16 20:37:40.578152 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:37:40.578133 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9\": container with ID starting with f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9 not found: ID does not exist" containerID="f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9" Apr 16 20:37:40.578214 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.578161 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9"} err="failed to get container status \"f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9\": rpc error: code = NotFound desc = could not find container \"f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9\": container with ID starting with 
f4647555c03eaeafb201fdec0bb3bab6011d3c9e18b502be59bf7d0a990707c9 not found: ID does not exist" Apr 16 20:37:40.578214 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.578177 2577 scope.go:117] "RemoveContainer" containerID="cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66" Apr 16 20:37:40.586314 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.586254 2577 scope.go:117] "RemoveContainer" containerID="cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66" Apr 16 20:37:40.586826 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:37:40.586799 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66\": container with ID starting with cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66 not found: ID does not exist" containerID="cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66" Apr 16 20:37:40.586940 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.586836 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66"} err="failed to get container status \"cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66\": rpc error: code = NotFound desc = could not find container \"cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66\": container with ID starting with cb7e6359760428243bf9c8633065a40f113bf2d08bc927bb923974b9fa7a1b66 not found: ID does not exist" Apr 16 20:37:40.587641 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.587591 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" podStartSLOduration=2.171273838 podStartE2EDuration="2.587574813s" podCreationTimestamp="2026-04-16 20:37:38 +0000 UTC" firstStartedPulling="2026-04-16 20:37:39.066075515 +0000 UTC m=+606.334995903" 
lastFinishedPulling="2026-04-16 20:37:39.482376478 +0000 UTC m=+606.751296878" observedRunningTime="2026-04-16 20:37:40.586708002 +0000 UTC m=+607.855628414" watchObservedRunningTime="2026-04-16 20:37:40.587574813 +0000 UTC m=+607.856495226" Apr 16 20:37:40.604869 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.603787 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-746bd9d694-hts9h"] Apr 16 20:37:40.616509 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.615665 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-746bd9d694-hts9h"] Apr 16 20:37:40.621279 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.620858 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-v6sj4"] Apr 16 20:37:40.621279 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.621164 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-v6sj4" podUID="8d8a4b51-138d-4cfe-b48b-6867912c5163" containerName="authorino" containerID="cri-o://09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d" gracePeriod=30 Apr 16 20:37:40.624546 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.624520 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-bq5p5"] Apr 16 20:37:40.627883 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.627861 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-bq5p5"] Apr 16 20:37:40.878278 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.878254 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:40.940444 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.940415 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnkh8\" (UniqueName: \"kubernetes.io/projected/8d8a4b51-138d-4cfe-b48b-6867912c5163-kube-api-access-xnkh8\") pod \"8d8a4b51-138d-4cfe-b48b-6867912c5163\" (UID: \"8d8a4b51-138d-4cfe-b48b-6867912c5163\") " Apr 16 20:37:40.942541 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:40.942515 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8a4b51-138d-4cfe-b48b-6867912c5163-kube-api-access-xnkh8" (OuterVolumeSpecName: "kube-api-access-xnkh8") pod "8d8a4b51-138d-4cfe-b48b-6867912c5163" (UID: "8d8a4b51-138d-4cfe-b48b-6867912c5163"). InnerVolumeSpecName "kube-api-access-xnkh8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:37:41.041631 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.041596 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnkh8\" (UniqueName: \"kubernetes.io/projected/8d8a4b51-138d-4cfe-b48b-6867912c5163-kube-api-access-xnkh8\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 20:37:41.285863 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.285827 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f47980d-11b3-4134-830c-01238f5e7aee" path="/var/lib/kubelet/pods/7f47980d-11b3-4134-830c-01238f5e7aee/volumes" Apr 16 20:37:41.286157 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.286144 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f182864-ca00-4506-aba3-520a6edb10c0" path="/var/lib/kubelet/pods/9f182864-ca00-4506-aba3-520a6edb10c0/volumes" Apr 16 20:37:41.576932 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.576839 2577 generic.go:358] "Generic (PLEG): container finished" podID="8d8a4b51-138d-4cfe-b48b-6867912c5163" 
containerID="09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d" exitCode=0 Apr 16 20:37:41.576932 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.576885 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-v6sj4" Apr 16 20:37:41.577402 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.576928 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-v6sj4" event={"ID":"8d8a4b51-138d-4cfe-b48b-6867912c5163","Type":"ContainerDied","Data":"09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d"} Apr 16 20:37:41.577402 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.576961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-v6sj4" event={"ID":"8d8a4b51-138d-4cfe-b48b-6867912c5163","Type":"ContainerDied","Data":"f5424844a6cec216d745f446fb3330fe44e35cae388624c9676acdbbba14a944"} Apr 16 20:37:41.577402 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.576977 2577 scope.go:117] "RemoveContainer" containerID="09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d" Apr 16 20:37:41.585585 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.585495 2577 scope.go:117] "RemoveContainer" containerID="09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d" Apr 16 20:37:41.585751 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:37:41.585732 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d\": container with ID starting with 09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d not found: ID does not exist" containerID="09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d" Apr 16 20:37:41.585791 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.585760 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d"} err="failed to get container status \"09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d\": rpc error: code = NotFound desc = could not find container \"09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d\": container with ID starting with 09589d2265ed7500d7cacdf9097b4a7d6c242309e46d9afe68737047e709931d not found: ID does not exist" Apr 16 20:37:41.596986 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.596960 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-v6sj4"] Apr 16 20:37:41.600598 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:41.600579 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-v6sj4"] Apr 16 20:37:43.285848 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:37:43.285813 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8a4b51-138d-4cfe-b48b-6867912c5163" path="/var/lib/kubelet/pods/8d8a4b51-138d-4cfe-b48b-6867912c5163/volumes" Apr 16 20:39:50.940553 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.940514 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7df45c5567-ndg7p"] Apr 16 20:39:50.941155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941055 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f182864-ca00-4506-aba3-520a6edb10c0" containerName="authorino" Apr 16 20:39:50.941155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941073 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f182864-ca00-4506-aba3-520a6edb10c0" containerName="authorino" Apr 16 20:39:50.941155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941088 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d8a4b51-138d-4cfe-b48b-6867912c5163" containerName="authorino" Apr 16 20:39:50.941155 
ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941097 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8a4b51-138d-4cfe-b48b-6867912c5163" containerName="authorino" Apr 16 20:39:50.941155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941120 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f47980d-11b3-4134-830c-01238f5e7aee" containerName="authorino" Apr 16 20:39:50.941155 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941130 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f47980d-11b3-4134-830c-01238f5e7aee" containerName="authorino" Apr 16 20:39:50.941450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941221 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d8a4b51-138d-4cfe-b48b-6867912c5163" containerName="authorino" Apr 16 20:39:50.941450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941236 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f182864-ca00-4506-aba3-520a6edb10c0" containerName="authorino" Apr 16 20:39:50.941450 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.941249 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f47980d-11b3-4134-830c-01238f5e7aee" containerName="authorino" Apr 16 20:39:50.943364 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.943343 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7df45c5567-ndg7p" Apr 16 20:39:50.951118 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.951096 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7df45c5567-ndg7p"] Apr 16 20:39:50.971793 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.971764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/13df1003-8d71-4254-9196-cb2a9b0369fe-tls-cert\") pod \"authorino-7df45c5567-ndg7p\" (UID: \"13df1003-8d71-4254-9196-cb2a9b0369fe\") " pod="kuadrant-system/authorino-7df45c5567-ndg7p" Apr 16 20:39:50.971911 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:50.971881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvx6x\" (UniqueName: \"kubernetes.io/projected/13df1003-8d71-4254-9196-cb2a9b0369fe-kube-api-access-kvx6x\") pod \"authorino-7df45c5567-ndg7p\" (UID: \"13df1003-8d71-4254-9196-cb2a9b0369fe\") " pod="kuadrant-system/authorino-7df45c5567-ndg7p" Apr 16 20:39:51.073301 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:51.073263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvx6x\" (UniqueName: \"kubernetes.io/projected/13df1003-8d71-4254-9196-cb2a9b0369fe-kube-api-access-kvx6x\") pod \"authorino-7df45c5567-ndg7p\" (UID: \"13df1003-8d71-4254-9196-cb2a9b0369fe\") " pod="kuadrant-system/authorino-7df45c5567-ndg7p" Apr 16 20:39:51.073301 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:51.073307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/13df1003-8d71-4254-9196-cb2a9b0369fe-tls-cert\") pod \"authorino-7df45c5567-ndg7p\" (UID: \"13df1003-8d71-4254-9196-cb2a9b0369fe\") " pod="kuadrant-system/authorino-7df45c5567-ndg7p" Apr 16 20:39:51.075890 ip-10-0-132-101 kubenswrapper[2577]: 
I0416 20:39:51.075863 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/13df1003-8d71-4254-9196-cb2a9b0369fe-tls-cert\") pod \"authorino-7df45c5567-ndg7p\" (UID: \"13df1003-8d71-4254-9196-cb2a9b0369fe\") " pod="kuadrant-system/authorino-7df45c5567-ndg7p"
Apr 16 20:39:51.082822 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:51.082802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvx6x\" (UniqueName: \"kubernetes.io/projected/13df1003-8d71-4254-9196-cb2a9b0369fe-kube-api-access-kvx6x\") pod \"authorino-7df45c5567-ndg7p\" (UID: \"13df1003-8d71-4254-9196-cb2a9b0369fe\") " pod="kuadrant-system/authorino-7df45c5567-ndg7p"
Apr 16 20:39:51.253678 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:51.253584 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7df45c5567-ndg7p"
Apr 16 20:39:51.379532 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:51.379506 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7df45c5567-ndg7p"]
Apr 16 20:39:51.380942 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:39:51.380915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13df1003_8d71_4254_9196_cb2a9b0369fe.slice/crio-2ddcd32da07c62a4ade8c69b8d8253a793b207024e4268ac2049eb8cf6a3ea76 WatchSource:0}: Error finding container 2ddcd32da07c62a4ade8c69b8d8253a793b207024e4268ac2049eb8cf6a3ea76: Status 404 returned error can't find the container with id 2ddcd32da07c62a4ade8c69b8d8253a793b207024e4268ac2049eb8cf6a3ea76
Apr 16 20:39:51.382500 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:51.382483 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:39:52.094010 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.093972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7df45c5567-ndg7p" event={"ID":"13df1003-8d71-4254-9196-cb2a9b0369fe","Type":"ContainerStarted","Data":"bd5e57831bafd4cc69f6e308f1508464119f9ad04b3f59d868c10aa70a5b9a0f"}
Apr 16 20:39:52.094010 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.094015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7df45c5567-ndg7p" event={"ID":"13df1003-8d71-4254-9196-cb2a9b0369fe","Type":"ContainerStarted","Data":"2ddcd32da07c62a4ade8c69b8d8253a793b207024e4268ac2049eb8cf6a3ea76"}
Apr 16 20:39:52.114628 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.114528 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7df45c5567-ndg7p" podStartSLOduration=1.636145 podStartE2EDuration="2.114513554s" podCreationTimestamp="2026-04-16 20:39:50 +0000 UTC" firstStartedPulling="2026-04-16 20:39:51.382601525 +0000 UTC m=+738.651521913" lastFinishedPulling="2026-04-16 20:39:51.860970067 +0000 UTC m=+739.129890467" observedRunningTime="2026-04-16 20:39:52.11163928 +0000 UTC m=+739.380559693" watchObservedRunningTime="2026-04-16 20:39:52.114513554 +0000 UTC m=+739.383433963"
Apr 16 20:39:52.154237 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.154164 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5d8d5c7698-sxm55"]
Apr 16 20:39:52.154721 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.154661 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" podUID="addb2fa1-4f6c-469d-8c41-08187434453f" containerName="authorino" containerID="cri-o://596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6" gracePeriod=30
Apr 16 20:39:52.404519 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.404495 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5d8d5c7698-sxm55"
Apr 16 20:39:52.485878 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.485843 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jqjr\" (UniqueName: \"kubernetes.io/projected/addb2fa1-4f6c-469d-8c41-08187434453f-kube-api-access-6jqjr\") pod \"addb2fa1-4f6c-469d-8c41-08187434453f\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") "
Apr 16 20:39:52.486051 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.485949 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/addb2fa1-4f6c-469d-8c41-08187434453f-tls-cert\") pod \"addb2fa1-4f6c-469d-8c41-08187434453f\" (UID: \"addb2fa1-4f6c-469d-8c41-08187434453f\") "
Apr 16 20:39:52.488156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.488122 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/addb2fa1-4f6c-469d-8c41-08187434453f-kube-api-access-6jqjr" (OuterVolumeSpecName: "kube-api-access-6jqjr") pod "addb2fa1-4f6c-469d-8c41-08187434453f" (UID: "addb2fa1-4f6c-469d-8c41-08187434453f"). InnerVolumeSpecName "kube-api-access-6jqjr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:39:52.496535 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.496508 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addb2fa1-4f6c-469d-8c41-08187434453f-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "addb2fa1-4f6c-469d-8c41-08187434453f" (UID: "addb2fa1-4f6c-469d-8c41-08187434453f"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:39:52.587222 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.587189 2577 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/addb2fa1-4f6c-469d-8c41-08187434453f-tls-cert\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:39:52.587222 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:52.587220 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jqjr\" (UniqueName: \"kubernetes.io/projected/addb2fa1-4f6c-469d-8c41-08187434453f-kube-api-access-6jqjr\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:39:53.099295 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.099257 2577 generic.go:358] "Generic (PLEG): container finished" podID="addb2fa1-4f6c-469d-8c41-08187434453f" containerID="596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6" exitCode=0
Apr 16 20:39:53.099711 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.099312 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5d8d5c7698-sxm55"
Apr 16 20:39:53.099711 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.099348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" event={"ID":"addb2fa1-4f6c-469d-8c41-08187434453f","Type":"ContainerDied","Data":"596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6"}
Apr 16 20:39:53.099711 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.099388 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5d8d5c7698-sxm55" event={"ID":"addb2fa1-4f6c-469d-8c41-08187434453f","Type":"ContainerDied","Data":"7e29ac23e2b65acbe1549243dbbbfeb177dc3fe3796498bf237c0e7c0adff125"}
Apr 16 20:39:53.099711 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.099406 2577 scope.go:117] "RemoveContainer" containerID="596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6"
Apr 16 20:39:53.108347 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.108331 2577 scope.go:117] "RemoveContainer" containerID="596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6"
Apr 16 20:39:53.108632 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:39:53.108614 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6\": container with ID starting with 596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6 not found: ID does not exist" containerID="596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6"
Apr 16 20:39:53.108708 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.108642 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6"} err="failed to get container status \"596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6\": rpc error: code = NotFound desc = could not find container \"596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6\": container with ID starting with 596bc51cbc48624e3df8ccc44ba8af58aac25772f3c7be1eb1914fac1e79b0c6 not found: ID does not exist"
Apr 16 20:39:53.120411 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.120384 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5d8d5c7698-sxm55"]
Apr 16 20:39:53.123969 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.123948 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5d8d5c7698-sxm55"]
Apr 16 20:39:53.291379 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:39:53.291342 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="addb2fa1-4f6c-469d-8c41-08187434453f" path="/var/lib/kubelet/pods/addb2fa1-4f6c-469d-8c41-08187434453f/volumes"
Apr 16 20:41:24.107273 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.107235 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-96d5c496f-7p77l"]
Apr 16 20:41:24.107804 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.107604 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="addb2fa1-4f6c-469d-8c41-08187434453f" containerName="authorino"
Apr 16 20:41:24.107804 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.107616 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="addb2fa1-4f6c-469d-8c41-08187434453f" containerName="authorino"
Apr 16 20:41:24.107804 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.107687 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="addb2fa1-4f6c-469d-8c41-08187434453f" containerName="authorino"
Apr 16 20:41:24.109829 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.109813 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:41:24.112308 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.112282 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-5kf7h\""
Apr 16 20:41:24.118210 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.118190 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-96d5c496f-7p77l"]
Apr 16 20:41:24.200943 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.200906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kstr\" (UniqueName: \"kubernetes.io/projected/d1c1e0c0-b0e7-45e7-9001-6252bc927725-kube-api-access-7kstr\") pod \"maas-controller-96d5c496f-7p77l\" (UID: \"d1c1e0c0-b0e7-45e7-9001-6252bc927725\") " pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:41:24.301920 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.301873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kstr\" (UniqueName: \"kubernetes.io/projected/d1c1e0c0-b0e7-45e7-9001-6252bc927725-kube-api-access-7kstr\") pod \"maas-controller-96d5c496f-7p77l\" (UID: \"d1c1e0c0-b0e7-45e7-9001-6252bc927725\") " pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:41:24.310629 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.310602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kstr\" (UniqueName: \"kubernetes.io/projected/d1c1e0c0-b0e7-45e7-9001-6252bc927725-kube-api-access-7kstr\") pod \"maas-controller-96d5c496f-7p77l\" (UID: \"d1c1e0c0-b0e7-45e7-9001-6252bc927725\") " pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:41:24.421103 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.421065 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:41:24.752548 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:24.752520 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-96d5c496f-7p77l"]
Apr 16 20:41:25.464620 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:25.464580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-96d5c496f-7p77l" event={"ID":"d1c1e0c0-b0e7-45e7-9001-6252bc927725","Type":"ContainerStarted","Data":"78a19d6f950cdf0fd4315a33c2e8bcab0ecf052c8122ac543686dc51b8bc9ec1"}
Apr 16 20:41:27.474041 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:27.474005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-96d5c496f-7p77l" event={"ID":"d1c1e0c0-b0e7-45e7-9001-6252bc927725","Type":"ContainerStarted","Data":"dea3b9ba2fd2d0fa940ac1cf0523b6e3fe7364ff3ccab2b68a37a91dd52aa877"}
Apr 16 20:41:27.474443 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:27.474148 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:41:27.490996 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:27.490950 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-96d5c496f-7p77l" podStartSLOduration=1.302874141 podStartE2EDuration="3.490937484s" podCreationTimestamp="2026-04-16 20:41:24 +0000 UTC" firstStartedPulling="2026-04-16 20:41:24.757326219 +0000 UTC m=+832.026246621" lastFinishedPulling="2026-04-16 20:41:26.945389576 +0000 UTC m=+834.214309964" observedRunningTime="2026-04-16 20:41:27.489180793 +0000 UTC m=+834.758101198" watchObservedRunningTime="2026-04-16 20:41:27.490937484 +0000 UTC m=+834.759857893"
Apr 16 20:41:38.483619 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:41:38.483586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-96d5c496f-7p77l"
Apr 16 20:42:33.258053 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:42:33.258027 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:42:33.259993 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:42:33.259968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:45:00.143020 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.142989 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606205-prjt4"]
Apr 16 20:45:00.145123 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.145107 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:45:00.147646 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.147623 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6dqgw\""
Apr 16 20:45:00.161556 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.161529 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606205-prjt4"]
Apr 16 20:45:00.260107 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.260070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24jd\" (UniqueName: \"kubernetes.io/projected/d1bb8b0c-5dc0-4325-9475-5118a5e4f336-kube-api-access-j24jd\") pod \"maas-api-key-cleanup-29606205-prjt4\" (UID: \"d1bb8b0c-5dc0-4325-9475-5118a5e4f336\") " pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:45:00.360804 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.360763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j24jd\" (UniqueName: \"kubernetes.io/projected/d1bb8b0c-5dc0-4325-9475-5118a5e4f336-kube-api-access-j24jd\") pod \"maas-api-key-cleanup-29606205-prjt4\" (UID: \"d1bb8b0c-5dc0-4325-9475-5118a5e4f336\") " pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:45:00.369036 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.369006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24jd\" (UniqueName: \"kubernetes.io/projected/d1bb8b0c-5dc0-4325-9475-5118a5e4f336-kube-api-access-j24jd\") pod \"maas-api-key-cleanup-29606205-prjt4\" (UID: \"d1bb8b0c-5dc0-4325-9475-5118a5e4f336\") " pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:45:00.454753 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.454722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:45:00.785910 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.785883 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606205-prjt4"]
Apr 16 20:45:00.787802 ip-10-0-132-101 kubenswrapper[2577]: W0416 20:45:00.787776 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1bb8b0c_5dc0_4325_9475_5118a5e4f336.slice/crio-4c695fddeccb75b66dabd19cce8bfdcd3f6136feca291e95c02c4dd028f23cd0 WatchSource:0}: Error finding container 4c695fddeccb75b66dabd19cce8bfdcd3f6136feca291e95c02c4dd028f23cd0: Status 404 returned error can't find the container with id 4c695fddeccb75b66dabd19cce8bfdcd3f6136feca291e95c02c4dd028f23cd0
Apr 16 20:45:00.789341 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:00.789323 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:45:01.301359 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:01.301325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerStarted","Data":"4c695fddeccb75b66dabd19cce8bfdcd3f6136feca291e95c02c4dd028f23cd0"}
Apr 16 20:45:02.306740 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:02.306705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerStarted","Data":"eb029fc2b0bf3d691627751bac311e8ff43e7a0908ec851a7a179272d8f08a4a"}
Apr 16 20:45:02.324276 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:02.324215 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" podStartSLOduration=1.724018655 podStartE2EDuration="2.324200593s" podCreationTimestamp="2026-04-16 20:45:00 +0000 UTC" firstStartedPulling="2026-04-16 20:45:00.789503534 +0000 UTC m=+1048.058423923" lastFinishedPulling="2026-04-16 20:45:01.389685473 +0000 UTC m=+1048.658605861" observedRunningTime="2026-04-16 20:45:02.321679127 +0000 UTC m=+1049.590599746" watchObservedRunningTime="2026-04-16 20:45:02.324200593 +0000 UTC m=+1049.593121004"
Apr 16 20:45:22.385372 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:22.385284 2577 generic.go:358] "Generic (PLEG): container finished" podID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerID="eb029fc2b0bf3d691627751bac311e8ff43e7a0908ec851a7a179272d8f08a4a" exitCode=6
Apr 16 20:45:22.385372 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:22.385355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerDied","Data":"eb029fc2b0bf3d691627751bac311e8ff43e7a0908ec851a7a179272d8f08a4a"}
Apr 16 20:45:22.385817 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:22.385660 2577 scope.go:117] "RemoveContainer" containerID="eb029fc2b0bf3d691627751bac311e8ff43e7a0908ec851a7a179272d8f08a4a"
Apr 16 20:45:23.390537 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:23.390505 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerStarted","Data":"df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"}
Apr 16 20:45:43.467314 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:43.467269 2577 generic.go:358] "Generic (PLEG): container finished" podID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerID="df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b" exitCode=6
Apr 16 20:45:43.467808 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:43.467327 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerDied","Data":"df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"}
Apr 16 20:45:43.467808 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:43.467378 2577 scope.go:117] "RemoveContainer" containerID="eb029fc2b0bf3d691627751bac311e8ff43e7a0908ec851a7a179272d8f08a4a"
Apr 16 20:45:43.467808 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:43.467698 2577 scope.go:117] "RemoveContainer" containerID="df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"
Apr 16 20:45:43.467965 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:45:43.467944 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606205-prjt4_opendatahub(d1bb8b0c-5dc0-4325-9475-5118a5e4f336)\"" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336"
Apr 16 20:45:58.280212 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:58.280174 2577 scope.go:117] "RemoveContainer" containerID="df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"
Apr 16 20:45:59.536726 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:45:59.536692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerStarted","Data":"da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3"}
Apr 16 20:46:00.010685 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:00.010650 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606205-prjt4"]
Apr 16 20:46:00.540722 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:00.540667 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup" containerID="cri-o://da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3" gracePeriod=30
Apr 16 20:46:19.184027 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.184001 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:46:19.333390 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.333303 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j24jd\" (UniqueName: \"kubernetes.io/projected/d1bb8b0c-5dc0-4325-9475-5118a5e4f336-kube-api-access-j24jd\") pod \"d1bb8b0c-5dc0-4325-9475-5118a5e4f336\" (UID: \"d1bb8b0c-5dc0-4325-9475-5118a5e4f336\") "
Apr 16 20:46:19.335490 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.335433 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1bb8b0c-5dc0-4325-9475-5118a5e4f336-kube-api-access-j24jd" (OuterVolumeSpecName: "kube-api-access-j24jd") pod "d1bb8b0c-5dc0-4325-9475-5118a5e4f336" (UID: "d1bb8b0c-5dc0-4325-9475-5118a5e4f336"). InnerVolumeSpecName "kube-api-access-j24jd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:46:19.434728 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.434689 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j24jd\" (UniqueName: \"kubernetes.io/projected/d1bb8b0c-5dc0-4325-9475-5118a5e4f336-kube-api-access-j24jd\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\""
Apr 16 20:46:19.612424 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.612333 2577 generic.go:358] "Generic (PLEG): container finished" podID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerID="da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3" exitCode=6
Apr 16 20:46:19.612424 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.612411 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4"
Apr 16 20:46:19.612610 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.612407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerDied","Data":"da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3"}
Apr 16 20:46:19.612610 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.612512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606205-prjt4" event={"ID":"d1bb8b0c-5dc0-4325-9475-5118a5e4f336","Type":"ContainerDied","Data":"4c695fddeccb75b66dabd19cce8bfdcd3f6136feca291e95c02c4dd028f23cd0"}
Apr 16 20:46:19.612610 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.612543 2577 scope.go:117] "RemoveContainer" containerID="da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3"
Apr 16 20:46:19.621429 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.621411 2577 scope.go:117] "RemoveContainer" containerID="df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"
Apr 16 20:46:19.629156 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.629137 2577 scope.go:117] "RemoveContainer" containerID="da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3"
Apr 16 20:46:19.629398 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:46:19.629376 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3\": container with ID starting with da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3 not found: ID does not exist" containerID="da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3"
Apr 16 20:46:19.629455 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.629407 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3"} err="failed to get container status \"da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3\": rpc error: code = NotFound desc = could not find container \"da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3\": container with ID starting with da802a01c73dc9f6661b8a6465a40cbf09dff6cda6311dffd045c67ec19690b3 not found: ID does not exist"
Apr 16 20:46:19.629455 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.629424 2577 scope.go:117] "RemoveContainer" containerID="df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"
Apr 16 20:46:19.629664 ip-10-0-132-101 kubenswrapper[2577]: E0416 20:46:19.629645 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b\": container with ID starting with df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b not found: ID does not exist" containerID="df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"
Apr 16 20:46:19.629707 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.629669 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b"} err="failed to get container status \"df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b\": rpc error: code = NotFound desc = could not find container \"df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b\": container with ID starting with df5f7e8a7546546b506d99d755b8e3a41b63456b8b6119ddf79a3e682c92423b not found: ID does not exist"
Apr 16 20:46:19.633410 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.633388 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606205-prjt4"]
Apr 16 20:46:19.637651 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:19.637628 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606205-prjt4"]
Apr 16 20:46:21.285236 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:46:21.285205 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" path="/var/lib/kubelet/pods/d1bb8b0c-5dc0-4325-9475-5118a5e4f336/volumes"
Apr 16 20:47:33.288778 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:47:33.288751 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:47:33.292040 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:47:33.292016 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:52:33.321906 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:52:33.321872 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:52:33.326718 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:52:33.326697 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:57:33.357890 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:57:33.357864 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 20:57:33.364074 ip-10-0-132-101 kubenswrapper[2577]: I0416 20:57:33.364049 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 21:00:00.139953 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.139920 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606220-pg2bb"]
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140283 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140294 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140307 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140314 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140384 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140394 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.140453 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.140401 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup"
Apr 16 21:00:00.143417 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.143401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb"
Apr 16 21:00:00.146124 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.146106 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6dqgw\""
Apr 16 21:00:00.164171 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.164146 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606220-pg2bb"]
Apr 16 21:00:00.227858 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.227822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974kq\" (UniqueName: \"kubernetes.io/projected/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07-kube-api-access-974kq\") pod \"maas-api-key-cleanup-29606220-pg2bb\" (UID: \"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07\") " pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb"
Apr 16 21:00:00.328619 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.328572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-974kq\" (UniqueName: \"kubernetes.io/projected/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07-kube-api-access-974kq\") pod \"maas-api-key-cleanup-29606220-pg2bb\" (UID: \"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07\") " pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb"
Apr 16 21:00:00.337040 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.337019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-974kq\" (UniqueName: \"kubernetes.io/projected/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07-kube-api-access-974kq\") pod \"maas-api-key-cleanup-29606220-pg2bb\" (UID: \"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07\") " pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb"
Apr 16 21:00:00.453171 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.453141 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb"
Apr 16 21:00:00.576644 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.576620 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606220-pg2bb"]
Apr 16 21:00:00.579104 ip-10-0-132-101 kubenswrapper[2577]: W0416 21:00:00.579077 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7658f4f4_f2f4_4d3a_aed1_42c6764e1e07.slice/crio-ef2d98f10a627f360de19168eb91390e25a20d54f8e2b87ab704ef50daa19f61 WatchSource:0}: Error finding container ef2d98f10a627f360de19168eb91390e25a20d54f8e2b87ab704ef50daa19f61: Status 404 returned error can't find the container with id ef2d98f10a627f360de19168eb91390e25a20d54f8e2b87ab704ef50daa19f61
Apr 16 21:00:00.580849 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.580831 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:00:00.742011 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:00.741937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerStarted","Data":"ef2d98f10a627f360de19168eb91390e25a20d54f8e2b87ab704ef50daa19f61"}
Apr 16 21:00:01.747684 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:01.747636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerStarted","Data":"e9aa45abf435cc3d3e107736ccc0a1619976d2873317dbb8e986c0281778e52e"}
Apr 16 21:00:01.764672 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:01.764628 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" podStartSLOduration=1.7646130690000001 podStartE2EDuration="1.764613069s" podCreationTimestamp="2026-04-16 21:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:01.762925172 +0000 UTC m=+1949.031845583" watchObservedRunningTime="2026-04-16 21:00:01.764613069 +0000 UTC m=+1949.033533478"
Apr 16 21:00:21.827161 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:21.827122 2577 generic.go:358] "Generic (PLEG): container finished" podID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerID="e9aa45abf435cc3d3e107736ccc0a1619976d2873317dbb8e986c0281778e52e" exitCode=6
Apr 16 21:00:21.827675 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:21.827197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerDied","Data":"e9aa45abf435cc3d3e107736ccc0a1619976d2873317dbb8e986c0281778e52e"}
Apr 16 21:00:21.827675 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:21.827625 2577 scope.go:117] "RemoveContainer" containerID="e9aa45abf435cc3d3e107736ccc0a1619976d2873317dbb8e986c0281778e52e"
Apr 16 21:00:22.832125 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:22.832085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerStarted","Data":"959eb098fd1a9c31c6ceeec42fadc56b5b524038e326c0449eaa0b28b2f1e96f"}
Apr 16 21:00:42.909260 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:42.909169 2577 generic.go:358] "Generic (PLEG): container finished" podID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerID="959eb098fd1a9c31c6ceeec42fadc56b5b524038e326c0449eaa0b28b2f1e96f" exitCode=6
Apr 16 21:00:42.909260 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:42.909212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerDied","Data":"959eb098fd1a9c31c6ceeec42fadc56b5b524038e326c0449eaa0b28b2f1e96f"}
Apr 16 21:00:42.909260 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:42.909243 2577 scope.go:117] "RemoveContainer" containerID="e9aa45abf435cc3d3e107736ccc0a1619976d2873317dbb8e986c0281778e52e"
Apr 16 21:00:42.909780 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:42.909574 2577 scope.go:117] "RemoveContainer" containerID="959eb098fd1a9c31c6ceeec42fadc56b5b524038e326c0449eaa0b28b2f1e96f"
Apr 16 21:00:42.909821 ip-10-0-132-101 kubenswrapper[2577]: E0416 21:00:42.909802 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606220-pg2bb_opendatahub(7658f4f4-f2f4-4d3a-aed1-42c6764e1e07)\"" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07"
Apr 16 21:00:53.285756 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:53.285719 2577 scope.go:117] "RemoveContainer" containerID="959eb098fd1a9c31c6ceeec42fadc56b5b524038e326c0449eaa0b28b2f1e96f"
Apr 16 21:00:53.953050 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:53.953018 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerStarted","Data":"c838683359dc6847eff5eed170c82d98b46a043228cf5b9e561f228e6dbd8528"}
Apr 16 21:00:54.329988 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:54.329903 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606220-pg2bb"]
Apr 16 21:00:54.957211 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:00:54.957140 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07"
containerName="cleanup" containerID="cri-o://c838683359dc6847eff5eed170c82d98b46a043228cf5b9e561f228e6dbd8528" gracePeriod=30 Apr 16 21:01:14.031404 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.031070 2577 generic.go:358] "Generic (PLEG): container finished" podID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerID="c838683359dc6847eff5eed170c82d98b46a043228cf5b9e561f228e6dbd8528" exitCode=6 Apr 16 21:01:14.031404 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.031119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerDied","Data":"c838683359dc6847eff5eed170c82d98b46a043228cf5b9e561f228e6dbd8528"} Apr 16 21:01:14.031404 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.031155 2577 scope.go:117] "RemoveContainer" containerID="959eb098fd1a9c31c6ceeec42fadc56b5b524038e326c0449eaa0b28b2f1e96f" Apr 16 21:01:14.101380 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.101357 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" Apr 16 21:01:14.166133 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.166101 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974kq\" (UniqueName: \"kubernetes.io/projected/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07-kube-api-access-974kq\") pod \"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07\" (UID: \"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07\") " Apr 16 21:01:14.168288 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.168265 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07-kube-api-access-974kq" (OuterVolumeSpecName: "kube-api-access-974kq") pod "7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" (UID: "7658f4f4-f2f4-4d3a-aed1-42c6764e1e07"). InnerVolumeSpecName "kube-api-access-974kq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 21:01:14.267577 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:14.267447 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-974kq\" (UniqueName: \"kubernetes.io/projected/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07-kube-api-access-974kq\") on node \"ip-10-0-132-101.ec2.internal\" DevicePath \"\"" Apr 16 21:01:15.036787 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:15.036758 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" Apr 16 21:01:15.037228 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:15.036786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606220-pg2bb" event={"ID":"7658f4f4-f2f4-4d3a-aed1-42c6764e1e07","Type":"ContainerDied","Data":"ef2d98f10a627f360de19168eb91390e25a20d54f8e2b87ab704ef50daa19f61"} Apr 16 21:01:15.037228 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:15.036830 2577 scope.go:117] "RemoveContainer" containerID="c838683359dc6847eff5eed170c82d98b46a043228cf5b9e561f228e6dbd8528" Apr 16 21:01:15.062165 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:15.062135 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606220-pg2bb"] Apr 16 21:01:15.065308 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:15.065282 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606220-pg2bb"] Apr 16 21:01:15.285622 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:01:15.285591 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" path="/var/lib/kubelet/pods/7658f4f4-f2f4-4d3a-aed1-42c6764e1e07/volumes" Apr 16 21:02:23.229400 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:23.229367 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-7df45c5567-ndg7p_13df1003-8d71-4254-9196-cb2a9b0369fe/authorino/0.log" Apr 16 21:02:27.417527 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:27.417498 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-96d5c496f-7p77l_d1c1e0c0-b0e7-45e7-9001-6252bc927725/manager/0.log" Apr 16 21:02:27.654300 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:27.654253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cc777b675-b9vl2_0d2e4714-1583-4a94-967a-9e28c0789279/manager/0.log" Apr 16 21:02:28.889432 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:28.889394 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7_8051c81e-8645-464d-83f1-69568256cc62/util/0.log" Apr 16 21:02:28.899752 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:28.899729 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7_8051c81e-8645-464d-83f1-69568256cc62/pull/0.log" Apr 16 21:02:28.908539 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:28.908520 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7_8051c81e-8645-464d-83f1-69568256cc62/extract/0.log" Apr 16 21:02:29.020298 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.020263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj_165b09d2-b5da-4790-90e1-fb3e80407ae7/util/0.log" Apr 16 21:02:29.028074 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.028051 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj_165b09d2-b5da-4790-90e1-fb3e80407ae7/pull/0.log" Apr 16 21:02:29.039978 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.039957 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj_165b09d2-b5da-4790-90e1-fb3e80407ae7/extract/0.log" Apr 16 21:02:29.154528 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.154424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft_c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1/util/0.log" Apr 16 21:02:29.168040 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.168004 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft_c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1/pull/0.log" Apr 16 21:02:29.177506 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.177489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft_c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1/extract/0.log" Apr 16 21:02:29.289856 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.289831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr_f24b5e31-4ddd-4b62-b158-cc3346e320fe/util/0.log" Apr 16 21:02:29.297351 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.297328 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr_f24b5e31-4ddd-4b62-b158-cc3346e320fe/pull/0.log" Apr 16 21:02:29.305284 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.305263 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr_f24b5e31-4ddd-4b62-b158-cc3346e320fe/extract/0.log" Apr 16 21:02:29.420672 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.420649 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7df45c5567-ndg7p_13df1003-8d71-4254-9196-cb2a9b0369fe/authorino/0.log" Apr 16 21:02:29.541393 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.541349 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-lhfd9_6733ba7b-b41e-49aa-83a1-d9182b15b58b/manager/0.log" Apr 16 21:02:29.650084 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.650055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-cqqdz_918b6360-89b2-4953-9df8-a6962b44bb8f/manager/0.log" Apr 16 21:02:29.894257 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:29.894173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-cglvc_3da3d1bd-62b1-4e70-90b9-c2c4960f4461/registry-server/0.log" Apr 16 21:02:30.287898 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:30.287867 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-vq99q_1aa88fac-6e4d-43a9-b227-e0c2f32adc1d/manager/0.log" Apr 16 21:02:30.647279 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:30.647205 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n_fa4767c5-9765-4d74-bd02-c64bde512afe/istio-proxy/0.log" Apr 16 21:02:31.116803 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:31.116768 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-pjmcv_1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a/istio-proxy/0.log" Apr 16 
21:02:31.233243 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:31.233212 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-cc5b6bc9d-d9pl5_b0125b55-3e0c-4bba-b620-3460a3974959/router/0.log" Apr 16 21:02:33.387209 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:33.387180 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 21:02:33.395496 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:33.395450 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log" Apr 16 21:02:36.444818 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.444786 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hdxxb/must-gather-qc96k"] Apr 16 21:02:36.445202 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445166 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup" Apr 16 21:02:36.445202 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445176 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1bb8b0c-5dc0-4325-9475-5118a5e4f336" containerName="cleanup" Apr 16 21:02:36.445202 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445185 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445202 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445191 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445361 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445211 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" 
containerName="cleanup" Apr 16 21:02:36.445361 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445217 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445361 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445273 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445361 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445283 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445361 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445333 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445361 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445339 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.445851 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.445403 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7658f4f4-f2f4-4d3a-aed1-42c6764e1e07" containerName="cleanup" Apr 16 21:02:36.448688 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.448671 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.451787 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.451763 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hdxxb\"/\"default-dockercfg-brd7j\"" Apr 16 21:02:36.451890 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.451807 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxxb\"/\"openshift-service-ca.crt\"" Apr 16 21:02:36.451890 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.451874 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxxb\"/\"kube-root-ca.crt\"" Apr 16 21:02:36.458936 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.458915 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/must-gather-qc96k"] Apr 16 21:02:36.575374 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.575344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027d9284-34cf-45f4-8ac9-9847de79bfcb-must-gather-output\") pod \"must-gather-qc96k\" (UID: \"027d9284-34cf-45f4-8ac9-9847de79bfcb\") " pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.575576 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.575380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf7nm\" (UniqueName: \"kubernetes.io/projected/027d9284-34cf-45f4-8ac9-9847de79bfcb-kube-api-access-cf7nm\") pod \"must-gather-qc96k\" (UID: \"027d9284-34cf-45f4-8ac9-9847de79bfcb\") " pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.676616 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.676578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/027d9284-34cf-45f4-8ac9-9847de79bfcb-must-gather-output\") pod \"must-gather-qc96k\" (UID: \"027d9284-34cf-45f4-8ac9-9847de79bfcb\") " pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.676616 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.676623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf7nm\" (UniqueName: \"kubernetes.io/projected/027d9284-34cf-45f4-8ac9-9847de79bfcb-kube-api-access-cf7nm\") pod \"must-gather-qc96k\" (UID: \"027d9284-34cf-45f4-8ac9-9847de79bfcb\") " pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.676984 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.676962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027d9284-34cf-45f4-8ac9-9847de79bfcb-must-gather-output\") pod \"must-gather-qc96k\" (UID: \"027d9284-34cf-45f4-8ac9-9847de79bfcb\") " pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.685277 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.685248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf7nm\" (UniqueName: \"kubernetes.io/projected/027d9284-34cf-45f4-8ac9-9847de79bfcb-kube-api-access-cf7nm\") pod \"must-gather-qc96k\" (UID: \"027d9284-34cf-45f4-8ac9-9847de79bfcb\") " pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.759638 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.759551 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hdxxb/must-gather-qc96k" Apr 16 21:02:36.890323 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:36.890284 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/must-gather-qc96k"] Apr 16 21:02:36.891569 ip-10-0-132-101 kubenswrapper[2577]: W0416 21:02:36.891525 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027d9284_34cf_45f4_8ac9_9847de79bfcb.slice/crio-9c71e5f9e1ec72c95fd42178997761726acfecbb48c3cefdc77ae328474a13a8 WatchSource:0}: Error finding container 9c71e5f9e1ec72c95fd42178997761726acfecbb48c3cefdc77ae328474a13a8: Status 404 returned error can't find the container with id 9c71e5f9e1ec72c95fd42178997761726acfecbb48c3cefdc77ae328474a13a8 Apr 16 21:02:37.351454 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:37.351419 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/must-gather-qc96k" event={"ID":"027d9284-34cf-45f4-8ac9-9847de79bfcb","Type":"ContainerStarted","Data":"9c71e5f9e1ec72c95fd42178997761726acfecbb48c3cefdc77ae328474a13a8"} Apr 16 21:02:38.359393 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:38.359297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/must-gather-qc96k" event={"ID":"027d9284-34cf-45f4-8ac9-9847de79bfcb","Type":"ContainerStarted","Data":"6cda1a134dc61dd870a15043c22b3de95018d92116c37468dd43af8eb6494f86"} Apr 16 21:02:38.359393 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:38.359350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/must-gather-qc96k" event={"ID":"027d9284-34cf-45f4-8ac9-9847de79bfcb","Type":"ContainerStarted","Data":"ee8e032872cbd11c4ec8c0b020e0836fab89459e1a6dfc621508849ba6f7ca70"} Apr 16 21:02:38.386879 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:38.386829 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-hdxxb/must-gather-qc96k" podStartSLOduration=1.474333656 podStartE2EDuration="2.386813032s" podCreationTimestamp="2026-04-16 21:02:36 +0000 UTC" firstStartedPulling="2026-04-16 21:02:36.893259458 +0000 UTC m=+2104.162179846" lastFinishedPulling="2026-04-16 21:02:37.805738821 +0000 UTC m=+2105.074659222" observedRunningTime="2026-04-16 21:02:38.385746347 +0000 UTC m=+2105.654666757" watchObservedRunningTime="2026-04-16 21:02:38.386813032 +0000 UTC m=+2105.655733444" Apr 16 21:02:39.527809 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:39.527778 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-f6vnk_3412a3da-a76f-4f08-b537-12d8e7e96c9d/global-pull-secret-syncer/0.log" Apr 16 21:02:39.595732 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:39.595692 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9tzkv_8211bddd-6e75-435d-865a-caf384cfefad/konnectivity-agent/0.log" Apr 16 21:02:39.681153 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:39.681124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-101.ec2.internal_5e83c939a004f23d83533d2ed83f17f1/haproxy/0.log" Apr 16 21:02:43.386367 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.386315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7_8051c81e-8645-464d-83f1-69568256cc62/extract/0.log" Apr 16 21:02:43.413090 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.413058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7_8051c81e-8645-464d-83f1-69568256cc62/util/0.log" Apr 16 21:02:43.439225 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.439199 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xv7p7_8051c81e-8645-464d-83f1-69568256cc62/pull/0.log" Apr 16 21:02:43.466356 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.466320 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj_165b09d2-b5da-4790-90e1-fb3e80407ae7/extract/0.log" Apr 16 21:02:43.501298 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.501267 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj_165b09d2-b5da-4790-90e1-fb3e80407ae7/util/0.log" Apr 16 21:02:43.526514 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.526480 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07mlmj_165b09d2-b5da-4790-90e1-fb3e80407ae7/pull/0.log" Apr 16 21:02:43.558959 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.558926 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft_c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1/extract/0.log" Apr 16 21:02:43.581313 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.581264 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft_c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1/util/0.log" Apr 16 21:02:43.620668 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.620617 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xwlft_c929b21c-5a05-4bd7-bbb0-2a86f8d1d9b1/pull/0.log" Apr 16 21:02:43.676540 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.676512 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr_f24b5e31-4ddd-4b62-b158-cc3346e320fe/extract/0.log" Apr 16 21:02:43.740119 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.740088 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr_f24b5e31-4ddd-4b62-b158-cc3346e320fe/util/0.log" Apr 16 21:02:43.794203 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:43.794169 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef122wqr_f24b5e31-4ddd-4b62-b158-cc3346e320fe/pull/0.log" Apr 16 21:02:44.046535 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:44.046411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7df45c5567-ndg7p_13df1003-8d71-4254-9196-cb2a9b0369fe/authorino/0.log" Apr 16 21:02:44.078494 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:44.077195 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-lhfd9_6733ba7b-b41e-49aa-83a1-d9182b15b58b/manager/0.log" Apr 16 21:02:44.112853 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:44.112825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-cqqdz_918b6360-89b2-4953-9df8-a6962b44bb8f/manager/0.log" Apr 16 21:02:44.176559 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:44.176522 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-cglvc_3da3d1bd-62b1-4e70-90b9-c2c4960f4461/registry-server/0.log" Apr 16 21:02:44.364406 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:44.364314 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-vq99q_1aa88fac-6e4d-43a9-b227-e0c2f32adc1d/manager/0.log" Apr 16 
21:02:46.056101 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.056013 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tpq9c_da561835-fd55-453b-91fa-23a89f82a5f3/cluster-monitoring-operator/0.log" Apr 16 21:02:46.213544 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.213394 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98fcz_bb3753c6-0017-4c11-a7e8-751eda08c472/node-exporter/0.log" Apr 16 21:02:46.231007 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.230977 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98fcz_bb3753c6-0017-4c11-a7e8-751eda08c472/kube-rbac-proxy/0.log" Apr 16 21:02:46.257997 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.257968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98fcz_bb3753c6-0017-4c11-a7e8-751eda08c472/init-textfile/0.log" Apr 16 21:02:46.441141 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.441110 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vqrb4_14437e71-f1f0-4889-bc9e-6d4684ff0b5c/kube-rbac-proxy-main/0.log" Apr 16 21:02:46.462589 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.462559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vqrb4_14437e71-f1f0-4889-bc9e-6d4684ff0b5c/kube-rbac-proxy-self/0.log" Apr 16 21:02:46.493146 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.493117 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vqrb4_14437e71-f1f0-4889-bc9e-6d4684ff0b5c/openshift-state-metrics/0.log" Apr 16 21:02:46.683994 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.683958 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-lpkm8_63d1762d-5ddc-41c4-a3d3-575e6308e36e/prometheus-operator/0.log"
Apr 16 21:02:46.705292 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.705216 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-lpkm8_63d1762d-5ddc-41c4-a3d3-575e6308e36e/kube-rbac-proxy/0.log"
Apr 16 21:02:46.731001 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:46.730973 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-zkfvp_28ba2caa-20e5-4406-8427-83e5469c3854/prometheus-operator-admission-webhook/0.log"
Apr 16 21:02:47.932308 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:47.932275 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"]
Apr 16 21:02:47.936907 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:47.936881 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:47.946044 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:47.946017 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"]
Apr 16 21:02:48.030677 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.030635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-lib-modules\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.030890 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.030701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbfw\" (UniqueName: \"kubernetes.io/projected/98824469-82d0-41d4-9e92-1c6692676d5d-kube-api-access-kxbfw\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.030890 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.030742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-sys\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.030890 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.030776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-podres\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.030890 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.030858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-proc\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.131663 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbfw\" (UniqueName: \"kubernetes.io/projected/98824469-82d0-41d4-9e92-1c6692676d5d-kube-api-access-kxbfw\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.131876 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-sys\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.131876 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-podres\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.131876 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-proc\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.132005 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-sys\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.132005 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131949 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-proc\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.132005 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.131953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-lib-modules\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.132111 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.132002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-podres\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.132111 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.132042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98824469-82d0-41d4-9e92-1c6692676d5d-lib-modules\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.139812 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.139785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbfw\" (UniqueName: \"kubernetes.io/projected/98824469-82d0-41d4-9e92-1c6692676d5d-kube-api-access-kxbfw\") pod \"perf-node-gather-daemonset-fbn4m\" (UID: \"98824469-82d0-41d4-9e92-1c6692676d5d\") " pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.249521 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.249405 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:48.416237 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.416207 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"]
Apr 16 21:02:48.418141 ip-10-0-132-101 kubenswrapper[2577]: W0416 21:02:48.418089 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod98824469_82d0_41d4_9e92_1c6692676d5d.slice/crio-edfcca553cbec8d8e84f6ae601a41b6b3df240f543345bdd3efeb03225000138 WatchSource:0}: Error finding container edfcca553cbec8d8e84f6ae601a41b6b3df240f543345bdd3efeb03225000138: Status 404 returned error can't find the container with id edfcca553cbec8d8e84f6ae601a41b6b3df240f543345bdd3efeb03225000138
Apr 16 21:02:48.987297 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:48.987265 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cd5f8c8f8-pcrpc_42e7fcdb-a2f7-4a91-88f6-cc1acc359a83/console/0.log"
Apr 16 21:02:49.018506 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:49.018456 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-6pptv_b420aa5e-b5f4-4c5a-963e-ff55196e4ca9/download-server/0.log"
Apr 16 21:02:49.427354 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:49.427315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m" event={"ID":"98824469-82d0-41d4-9e92-1c6692676d5d","Type":"ContainerStarted","Data":"0782bea9d2b459efce4ea452bb96cd4c3c41ff10053a0c8f3bf5383f89e45f0e"}
Apr 16 21:02:49.427579 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:49.427362 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m" event={"ID":"98824469-82d0-41d4-9e92-1c6692676d5d","Type":"ContainerStarted","Data":"edfcca553cbec8d8e84f6ae601a41b6b3df240f543345bdd3efeb03225000138"}
Apr 16 21:02:49.427579 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:49.427397 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:49.447189 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:49.447136 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m" podStartSLOduration=2.44711793 podStartE2EDuration="2.44711793s" podCreationTimestamp="2026-04-16 21:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:02:49.444393019 +0000 UTC m=+2116.713313429" watchObservedRunningTime="2026-04-16 21:02:49.44711793 +0000 UTC m=+2116.716038363"
Apr 16 21:02:50.262256 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:50.262225 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xbszw_a2a1535c-c8dc-4688-a07f-00a01b4dec34/dns/0.log"
Apr 16 21:02:50.282698 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:50.282672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xbszw_a2a1535c-c8dc-4688-a07f-00a01b4dec34/kube-rbac-proxy/0.log"
Apr 16 21:02:50.327671 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:50.327639 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jk4m8_41119468-3774-48bd-98be-d49ab3625162/dns-node-resolver/0.log"
Apr 16 21:02:50.785985 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:50.785951 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7c9d75854d-g48xr_8d5a6299-5ec2-4b7a-9105-5e52be9dc830/registry/0.log"
Apr 16 21:02:50.870019 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:50.869991 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m848r_8df02eba-eb01-4603-87bd-76a281217485/node-ca/0.log"
Apr 16 21:02:51.726385 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:51.726349 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfcrj6n_fa4767c5-9765-4d74-bd02-c64bde512afe/istio-proxy/0.log"
Apr 16 21:02:52.005096 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:52.005018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-pjmcv_1ad42fac-c0e0-4e58-bc1c-bb9fbf5bc66a/istio-proxy/0.log"
Apr 16 21:02:52.029947 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:52.029919 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-cc5b6bc9d-d9pl5_b0125b55-3e0c-4bba-b620-3460a3974959/router/0.log"
Apr 16 21:02:52.592375 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:52.592345 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-skwc8_f1de9e75-c8b2-4fee-898a-82488ff8d677/serve-healthcheck-canary/0.log"
Apr 16 21:02:53.228694 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:53.228660 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rrklb_bab34410-b402-43af-930c-05dbd9430ae8/kube-rbac-proxy/0.log"
Apr 16 21:02:53.248318 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:53.248291 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rrklb_bab34410-b402-43af-930c-05dbd9430ae8/exporter/0.log"
Apr 16 21:02:53.274139 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:53.274116 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rrklb_bab34410-b402-43af-930c-05dbd9430ae8/extractor/0.log"
Apr 16 21:02:55.308585 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:55.308555 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-96d5c496f-7p77l_d1c1e0c0-b0e7-45e7-9001-6252bc927725/manager/0.log"
Apr 16 21:02:55.369494 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:55.369453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6cc777b675-b9vl2_0d2e4714-1583-4a94-967a-9e28c0789279/manager/0.log"
Apr 16 21:02:55.443724 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:55.443698 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hdxxb/perf-node-gather-daemonset-fbn4m"
Apr 16 21:02:56.752736 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:02:56.752693 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7b555bff64-z6t98_d7128bde-4264-4df6-b066-f6f9e789cd5f/manager/0.log"
Apr 16 21:03:01.510829 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:01.510766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-kqkjb_1a4fa98c-05e4-48fe-93c7-d01bd593d03a/kube-storage-version-migrator-operator/1.log"
Apr 16 21:03:01.511538 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:01.511521 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-kqkjb_1a4fa98c-05e4-48fe-93c7-d01bd593d03a/kube-storage-version-migrator-operator/0.log"
Apr 16 21:03:02.669825 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:02.669795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-264w9_a8845fea-ade3-4e74-b157-294175ce8b1a/kube-multus/0.log"
Apr 16 21:03:03.048526 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.048434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:03:03.072746 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.072726 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/egress-router-binary-copy/0.log"
Apr 16 21:03:03.095212 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.095189 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/cni-plugins/0.log"
Apr 16 21:03:03.124971 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.124949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/bond-cni-plugin/0.log"
Apr 16 21:03:03.150007 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.149988 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/routeoverride-cni/0.log"
Apr 16 21:03:03.169950 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.169924 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/whereabouts-cni-bincopy/0.log"
Apr 16 21:03:03.190842 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.190818 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vvztc_ffb263cb-6f76-4dfe-a02a-435624b83457/whereabouts-cni/0.log"
Apr 16 21:03:03.249967 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.249902 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-b8p9v_30274609-546d-4c7b-abd0-8907fd0a6cd7/network-metrics-daemon/0.log"
Apr 16 21:03:03.270641 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:03.270617 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-b8p9v_30274609-546d-4c7b-abd0-8907fd0a6cd7/kube-rbac-proxy/0.log"
Apr 16 21:03:04.464345 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.464312 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-controller/0.log"
Apr 16 21:03:04.481121 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.481090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/0.log"
Apr 16 21:03:04.494160 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.494125 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovn-acl-logging/1.log"
Apr 16 21:03:04.516675 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.516646 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/kube-rbac-proxy-node/0.log"
Apr 16 21:03:04.538030 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.537998 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:03:04.557506 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.557453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/northd/0.log"
Apr 16 21:03:04.581362 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.581328 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/nbdb/0.log"
Apr 16 21:03:04.609350 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.609328 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/sbdb/0.log"
Apr 16 21:03:04.721219 ip-10-0-132-101 kubenswrapper[2577]: I0416 21:03:04.721124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5gm4x_2a1c475f-d8da-4de3-9d2f-33da4c16e0fa/ovnkube-controller/0.log"